
MCT USE ONLY.

STUDENT USE PROHIBITED


OFFICIAL MICROSOFT LEARNING PRODUCT

20466C
Implementing Data Models and Reports
with Microsoft SQL Server
ii Implementing Data Models and Reports with Microsoft SQL Server 2012

Information in this document, including URL and other Internet Web site references, is subject to change
without notice. Unless otherwise noted, the example companies, organizations, products, domain names,
e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with
any real company, organization, product, domain name, e-mail address, logo, person, place or event is
intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the
user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in
or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property
rights covering subject matter in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not give you any license to these
patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and
Microsoft makes no representations or warranties, either expressed, implied, or statutory, regarding
these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a
manufacturer or product does not imply endorsement by Microsoft of the manufacturer or product. Links
may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not
responsible for the contents of any linked site or any link contained in a linked site, or any changes or
updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission
received from any linked site. Microsoft is providing these links to you only as a convenience, and the
inclusion of any link does not imply endorsement by Microsoft of the site or the products contained
therein.
© 2014 Microsoft Corporation. All rights reserved.

Microsoft and the trademarks listed at
http://www.microsoft.com/about/legal/en/us/IntellectualProperty/Trademarks/EN-US.aspx are trademarks
of the Microsoft group of companies. All other trademarks are property of their respective owners.

Product Number: 20466C

Part Number (if applicable): X19-32477

Released: 07/2014
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE

These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which
includes the media on which you received it, if any. These license terms also apply to Trainer Content and any
updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms
apply.

BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS.
IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.

If you comply with these license terms, you have the rights below for each license you acquire.

1. DEFINITIONS.

a. Authorized Learning Center means a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, or such other entity as Microsoft may designate from time to time.

b. Authorized Training Session means the instructor-led training class using Microsoft Instructor-Led
Courseware conducted by a Trainer at or through an Authorized Learning Center.

c. Classroom Device means one (1) dedicated, secure computer that an Authorized Learning Center owns
or controls that is located at an Authorized Learning Center's training facilities that meets or exceeds the
hardware level specified for the particular Microsoft Instructor-Led Courseware.

d. End User means an individual who is (i) duly enrolled in and attending an Authorized Training Session
or Private Training Session, (ii) an employee of an MPN Member, or (iii) a Microsoft full-time employee.

e. Licensed Content means the content accompanying this agreement which may include the Microsoft
Instructor-Led Courseware or Trainer Content.

f. Microsoft Certified Trainer or MCT means an individual who is (i) engaged to teach a training session
to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a
Microsoft Certified Trainer under the Microsoft Certification Program.

g. Microsoft Instructor-Led Courseware means the Microsoft-branded instructor-led training course that
educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led
Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.

h. Microsoft IT Academy Program Member means an active member of the Microsoft IT Academy
Program.

i. Microsoft Learning Competency Member means an active member of the Microsoft Partner Network
program in good standing that currently holds the Learning Competency status.

j. MOC means the Official Microsoft Learning Product instructor-led courseware known as Microsoft
Official Course that educates IT professionals and developers on Microsoft technologies.

k. MPN Member means an active Microsoft Partner Network program member in good standing.
l. Personal Device means one (1) personal computer, device, workstation or other digital electronic device
that you personally own or control that meets or exceeds the hardware level specified for the particular
Microsoft Instructor-Led Courseware.

m. Private Training Session means the instructor-led training classes provided by MPN Members for
corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware.
These classes are not advertised or promoted to the general public and class attendance is restricted to
individuals employed by or contracted by the corporate customer.

n. Trainer means (i) an academically accredited educator engaged by a Microsoft IT Academy Program
Member to teach an Authorized Training Session, and/or (ii) an MCT.

o. Trainer Content means the trainer version of the Microsoft Instructor-Led Courseware and additional
supplemental content designated solely for Trainers' use to teach a training session using the Microsoft
Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer
preparation guide, train-the-trainer materials, Microsoft OneNote packs, classroom setup guide and Pre-
release course feedback form. To clarify, Trainer Content does not include any software, virtual hard
disks or virtual machines.

2. USE RIGHTS. The Licensed Content is licensed, not sold. The Licensed Content is licensed on a one copy
per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed
Content.

2.1 Below are five separate sets of use rights. Only one set of rights applies to you.

a. If you are a Microsoft IT Academy Program Member:


i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User who is enrolled in the Authorized Training Session, and only immediately prior to the
commencement of the Authorized Training Session that is the subject matter of the Microsoft
Instructor-Led Courseware being provided, or
2. provide one (1) End User with the unique redemption code and instructions on how they can
access one (1) digital version of the Microsoft Instructor-Led Courseware, or
3. provide one (1) Trainer with the unique redemption code and instructions on how they can
access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure each End User attending an Authorized Training Session has their own valid licensed
copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized Training
Session,
v. you will ensure that each End User provided with the hard-copy version of the Microsoft Instructor-
Led Courseware will be presented with a copy of this agreement and each End User will agree that
their use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement
prior to providing them with the Microsoft Instructor-Led Courseware. Each individual will be required
to denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid
licensed copy of the Trainer Content that is the subject of the Authorized Training Session,
vii. you will only use qualified Trainers who have in-depth knowledge of and experience with the
Microsoft technology that is the subject of the Microsoft Instructor-Led Courseware being taught for
all your Authorized Training Sessions,
viii. you will only deliver a maximum of 15 hours of training per week for each Authorized Training
Session that uses a MOC title, and
ix. you acknowledge that Trainers that are not MCTs will not have access to all of the trainer resources
for the Microsoft Instructor-Led Courseware.

b. If you are a Microsoft Learning Competency Member:


i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User attending the Authorized Training Session and only immediately prior to the
commencement of the Authorized Training Session that is the subject matter of the Microsoft
Instructor-Led Courseware provided, or
2. provide one (1) End User attending the Authorized Training Session with the unique redemption
code and instructions on how they can access one (1) digital version of the Microsoft Instructor-
Led Courseware, or
3. provide one (1) Trainer with the unique redemption code and instructions on how they
can access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure that each End User attending an Authorized Training Session has their own valid
licensed copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized
Training Session,
v. you will ensure that each End User provided with a hard-copy version of the Microsoft Instructor-Led
Courseware will be presented with a copy of this agreement and each End User will agree that their
use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to
providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to
denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid
licensed copy of the Trainer Content that is the subject of the Authorized Training Session,
vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is
the subject of the Microsoft Instructor-Led Courseware being taught for your Authorized Training
Sessions,
viii. you will only use qualified MCTs who also hold the applicable Microsoft Certification credential that is
the subject of the MOC title being taught for all your Authorized Training Sessions using MOC,
ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and
x. you will only provide access to the Trainer Content to Trainers.
c. If you are an MPN Member:
i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User attending the Private Training Session, and only immediately prior to the commencement
of the Private Training Session that is the subject matter of the Microsoft Instructor-Led
Courseware being provided, or
2. provide one (1) End User who is attending the Private Training Session with the unique
redemption code and instructions on how they can access one (1) digital version of the
Microsoft Instructor-Led Courseware, or
3. provide one (1) Trainer who is teaching the Private Training Session with the unique
redemption code and instructions on how they can access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure that each End User attending a Private Training Session has their own valid licensed
copy of the Microsoft Instructor-Led Courseware that is the subject of the Private Training Session,
v. you will ensure that each End User provided with a hard copy version of the Microsoft Instructor-Led
Courseware will be presented with a copy of this agreement and each End User will agree that their
use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to
providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to
denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching a Private Training Session has their own valid licensed
copy of the Trainer Content that is the subject of the Private Training Session,
vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is
the subject of the Microsoft Instructor-Led Courseware being taught for all your Private Training
Sessions,
viii. you will only use qualified MCTs who hold the applicable Microsoft Certification credential that is the
subject of the MOC title being taught for all your Private Training Sessions using MOC,
ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and
x. you will only provide access to the Trainer Content to Trainers.

d. If you are an End User:


For each license you acquire, you may use the Microsoft Instructor-Led Courseware solely for your
personal training use. If the Microsoft Instructor-Led Courseware is in digital format, you may access the
Microsoft Instructor-Led Courseware online using the unique redemption code provided to you by the
training provider and install and use one (1) copy of the Microsoft Instructor-Led Courseware on up to
three (3) Personal Devices. You may also print one (1) copy of the Microsoft Instructor-Led Courseware.
You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

e. If you are a Trainer:


i. For each license you acquire, you may install and use one (1) copy of the Trainer Content in the
form provided to you on one (1) Personal Device solely to prepare and deliver an Authorized
Training Session or Private Training Session, and install one (1) additional copy on another Personal
Device as a backup copy, which may be used only to reinstall the Trainer Content. You may not
install or use a copy of the Trainer Content on a device you do not own or control. You may also
print one (1) copy of the Trainer Content solely to prepare for and deliver an Authorized Training
Session or Private Training Session.
ii. You may customize the written portions of the Trainer Content that are logically associated with
instruction of a training session in accordance with the most recent version of the MCT agreement.
If you elect to exercise the foregoing rights, you agree to comply with the following: (i)
customizations may only be used for teaching Authorized Training Sessions and Private Training
Sessions, and (ii) all customizations will comply with this agreement. For clarity, any use of
"customize" refers only to changing the order of slides and content, and/or not using all the slides or
content; it does not mean changing or modifying any slide or content.

2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not
separate its components and install them on different devices.

2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may
not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any
third parties without the express written permission of Microsoft.

2.4 Third Party Notices. The Licensed Content may include third party content that Microsoft, not the
third party, licenses to you under this agreement. Notices, if any, for the third party content are included
for your information only.

2.5 Additional Terms. Some Licensed Content may contain components with additional terms,
conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also
apply to your use of that respective component and supplement the terms described in this agreement.

3. LICENSED CONTENT BASED ON PRE-RELEASE TECHNOLOGY. If the Licensed Content's subject
matter is based on a pre-release version of Microsoft technology ("Pre-release"), then in addition to the
other provisions in this agreement, these terms also apply:

a. Pre-Release Licensed Content. This Licensed Content's subject matter is based on the Pre-release
version of the Microsoft technology. The technology may not work the way a final version of the technology will
and we may change the technology for the final version. We also may not release a final version.
Licensed Content based on the final version of the technology may not contain the same information as
the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you
with any further content, including any Licensed Content based on the final version of the technology.

b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or
through its third party designee, you give to Microsoft without charge, the right to use, share and
commercialize your feedback in any way and for any purpose. You also give to third parties, without
charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback.
You will not give feedback that is subject to a license that requires Microsoft to license its technology,
technologies, or products to third parties because we include your feedback in them. These rights
survive this agreement.

c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on
the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the
Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the
technology that is the subject of the Licensed Content, whichever is earliest ("Pre-release term").
Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies
of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some
rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more
rights despite this limitation, you may use the Licensed Content only as expressly permitted in this
agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only
allow you to use it in certain ways. Except as expressly permitted in this agreement, you may not:
o access or allow any individual to access the Licensed Content if they have not acquired a valid license
for the Licensed Content,
o alter, remove or obscure any copyright or other protective notices (including watermarks), branding
or identifications contained in the Licensed Content,
o modify or create a derivative work of any Licensed Content,
o publicly display, or make the Licensed Content available for others to access or use,
o copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or
distribute the Licensed Content to any third party,
o work around any technical limitations in the Licensed Content, or
o reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the
Licensed Content except and only to the extent that applicable law expressly permits, despite this
limitation.

5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to
you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws
and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the
Licensed Content.

6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations.
You must comply with all domestic and international export laws and regulations that apply to the Licensed
Content. These laws include restrictions on destinations, end users and end use. For additional information,
see www.microsoft.com/exporting.

7. SUPPORT SERVICES. Because the Licensed Content is "as is", we may not provide support services for it.

8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail
to comply with the terms and conditions of this agreement. Upon termination of this agreement for any
reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in
your possession or under your control.

9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed
Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for
the contents of any third party sites, any links contained in third party sites, or any changes or updates to
third party sites. Microsoft is not responsible for webcasting or any other form of transmission received
from any third party sites. Microsoft is providing these links to third party sites to you only as a
convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party
site.

10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and
supplements are the entire agreement for the Licensed Content, updates and supplements.

11. APPLICABLE LAW.


a. United States. If you acquired the Licensed Content in the United States, Washington state law governs
the interpretation of this agreement and applies to claims for breach of it, regardless of conflict of laws
principles. The laws of the state where you live govern all other claims, including claims under state
consumer protection laws, unfair competition laws, and in tort.
b. Outside the United States. If you acquired the Licensed Content in any other country, the laws of that
country apply.

12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws
of your country. You may also have rights with respect to the party from whom you acquired the Licensed
Content. This agreement does not change your rights under the laws of your country if the laws of your
country do not permit it to do so.

13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS
AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE
AFFILIATES GIVE NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY
HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT
CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND
ITS RESPECTIVE AFFILIATES EXCLUDE ANY IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.

14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP
TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL,
LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.

This limitation applies to
o anything related to the Licensed Content, services, content (including code) on third party Internet
sites or third-party programs; and
o claims for breach of contract, breach of warranty, guarantee or condition, strict liability, negligence,
or other tort to the extent permitted by applicable law.

It also applies even if Microsoft knew or should have known about the possibility of the damages. The
above limitation or exclusion may not apply to you because your country may not allow the exclusion or
limitation of incidental, consequential or other damages.

Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this
agreement are provided below in French.

Remarque : Ce contenu sous licence étant distribué au Québec, Canada, certaines des clauses
dans ce contrat sont fournies ci-dessous en français.

EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute
utilisation de ce contenu sous licence est à votre seule risque et péril. Microsoft n'accorde aucune autre garantie
expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection des
consommateurs, que ce contrat ne peut modifier. Là où elles sont permises par le droit local, les garanties
implicites de qualité marchande, d'adéquation à un usage particulier et d'absence de contrefaçon sont exclues.

LIMITATION DES DOMMAGES-INTÉRÊTS ET EXCLUSION DE RESPONSABILITÉ POUR LES
DOMMAGES. Vous pouvez obtenir de Microsoft et de ses fournisseurs une indemnisation en cas de dommages
directs uniquement à hauteur de 5,00 $ US. Vous ne pouvez prétendre à aucune indemnisation pour les autres
dommages, y compris les dommages spéciaux, indirects ou accessoires et pertes de bénéfices.
Cette limitation concerne :
o tout ce qui est relié au contenu sous licence, aux services ou au contenu (y compris le code)
figurant sur des sites Internet tiers ou dans des programmes tiers ; et
o les réclamations au titre de violation de contrat ou de garantie, ou au titre de responsabilité
stricte, de négligence ou d'une autre faute dans la limite autorisée par la loi en vigueur.
Elle s'applique également, même si Microsoft connaissait ou devrait connaître l'éventualité d'un tel dommage. Si
votre pays n'autorise pas l'exclusion ou la limitation de responsabilité pour les dommages indirects, accessoires
ou de quelque nature que ce soit, il se peut que la limitation ou l'exclusion ci-dessus ne s'appliquera pas à votre
égard.

EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d'autres droits
prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre
pays si celles-ci ne le permettent pas.

Revised July 2013



Acknowledgments
Microsoft Learning would like to acknowledge and thank the following for their contribution towards
developing this title. Their efforts at various stages in its development have ensured that you have a good
classroom experience.

Graeme Malcolm, Lead Content Developer

Graeme Malcolm is a Microsoft SQL Server subject matter expert and professional content developer at
Content Master (a division of CM Group Ltd). As a Microsoft Certified Trainer, Graeme has delivered
training courses on SQL Server since version 4.2; as an author, he has written numerous books,
articles, and training courses on SQL Server; and as a consultant, he has designed and implemented
business solutions based on SQL Server for customers all over the world.

Chris Testa-O'Neill, Technical Reviewer

Chris Testa-O'Neill is a SQL Server Microsoft Most Valuable Professional (MVP), Microsoft Certified
Trainer, and independent DBA and SQL Server Business Intelligence consultant at Claribi. He is a regular
speaker on the international circuit, runs the Manchester (UK) SQL Server User Group and SQLBits, and is
a co-founder of SQLRelay. Chris is an MCT, MCDBA, MCTS, MCITP, MCSA and MCSE in SQL Server. He can
be contacted at chris@claribi.com or @ctesta_oneill.

Contents
Module 1: Introduction to Business Intelligence and Data Modeling
Module Overview 1-1

Lesson 1: Elements of an Enterprise BI Solution 1-2

Lesson 2: The Microsoft Enterprise BI Platform 1-10

Lesson 3: Planning an Enterprise BI Project 1-14

Lab: Exploring a BI Solution 1-21

Module Review and Takeaways 1-25

Module 2: Creating Multidimensional Databases


Module Overview 2-1
Lesson 1: Introduction to Multidimensional Analysis 2-3

Lesson 2: Creating Data Sources and Data Source Views 2-9

Lesson 3: Creating a Cube 2-13


Lesson 4: Overview of Cube Security 2-17

Lab: Creating a Multidimensional Database 2-23

Module Review and Takeaways 2-28

Module 3: Working with Cubes and Dimensions


Module Overview 3-1

Lesson 1: Configuring Dimensions 3-2

Lesson 2: Defining Attribute Hierarchies 3-8

Lesson 3: Sorting and Grouping Attributes 3-14

Lab: Defining Dimensions 3-16

Module Review and Takeaways 3-24

Module 4: Working with Measures and Measure Groups


Module Overview 4-1

Lesson 1: Working with Measures 4-2

Lesson 2: Working with Measure Groups 4-6


Lab: Configuring Measures and Measure Groups 4-13

Module Review and Takeaways 4-17



Module 5: Introduction to MDX


Module Overview 5-1

Lesson 1: MDX Fundamentals 5-2

Lesson 2: Adding Calculations to a Cube 5-6

Lesson 3: Using MDX to Query a Cube 5-13

Lab: Using MDX 5-18

Module Review and Takeaways 5-21

Module 6: Enhancing a Cube


Module Overview 6-1

Lesson 1: Working with Key Performance Indicators 6-2

Lesson 2: Working with Actions 6-7

Lesson 3: Working with Perspectives 6-11

Lesson 4: Working with Translations 6-13


Lab: Customizing a Cube 6-15

Module Review and Takeaways 6-19

Module 7: Implementing an Analysis Services Tabular Data Model


Module Overview 7-1
Lesson 1: Introduction to Analysis Services Tabular Data Models 7-2

Lesson 2: Creating a Tabular Data Model 7-7

Lesson 3: Using an Analysis Services Tabular Data Model in the Enterprise 7-16
Lab: Implementing an Analysis Services Tabular Data Model 7-23

Module Review and Takeaways 7-32

Module 8: Introduction to DAX


Module Overview 8-1

Lesson 1: DAX Fundamentals 8-2

Lesson 2: Enhancing a Tabular Data Model with DAX 8-9

Lab: Using DAX to Enhance a Tabular Data Model 8-20

Module Review and Takeaways 8-26



Module 9: Implementing Reports with SQL Server Reporting Services


Module Overview 9-1

Lesson 1: Introduction to Reporting Services 9-2

Lesson 2: Creating a Report with Report Designer 9-6

Lesson 3: Grouping and Aggregating Data in a Report 9-16

Lesson 4: Publishing and Viewing a Report 9-23

Lab: Creating a Report with Report Designer 9-27

Module Review and Takeaways 9-32

Module 10: Enhancing Reports with SQL Server Reporting Services


Module Overview 10-1

Lesson 1: Showing Data Graphically 10-2

Lesson 2: Filtering Reports by Using Parameters 10-10

Lab: Enhancing a Report 10-16


Module Review and Takeaways 10-23

Module 11: Managing Report Execution and Delivery


Module Overview 11-1

Lesson 1: Managing Report Security 11-2


Lesson 2: Managing Report Execution 11-6

Lesson 3: Subscriptions and Data Alerts 11-10

Lesson 4: Troubleshooting Reporting Services 11-17


Lab: Configuring Report Execution and Delivery 11-20

Module Review and Takeaways 11-24

Module 12: Delivering BI with SharePoint PerformancePoint Services


Module Overview 12-1

Lesson 1: Introduction to SharePoint Server as a BI Platform 12-2

Lesson 2: Introduction to PerformancePoint Services 12-9

Lesson 3: PerformancePoint Data Sources and Time Intelligence 12-12

Lesson 4: Reports, Scorecards, and Dashboards 12-16

Lab: Implementing a SharePoint Server BI Solution 12-23

Module Review and Takeaways 12-29



Module 13: Performing Predictive Analysis with Data Mining


Module Overview 13-1

Lesson 1: Overview of Data Mining 13-2

Lesson 2: Creating a Data Mining Solution 13-8

Lesson 3: Validating a Data Mining Model 13-12

Lesson 4: Consuming Data Mining Data 13-17

Lab: Using Data Mining to Support a Marketing Campaign 13-21

Module Review and Takeaways 13-26

Lab Answer Keys


Module 1 Lab: Exploring a BI Solution L01-1

Module 2 Lab: Creating a Multidimensional Database L02-1

Module 3 Lab: Defining Dimensions L03-1

Module 4 Lab: Configuring Measures and Measure Groups L04-1


Module 5 Lab: Using MDX L05-1

Module 6 Lab: Customizing a Cube L06-1

Module 7 Lab: Implementing an Analysis Services Tabular Data Model L07-1

Module 8 Lab: Using DAX to Enhance a Tabular Data Model L08-1

Module 9 Lab: Creating a Report with Report Designer L09-1

Module 10 Lab: Enhancing a Report L10-1


Module 11 Lab: Configuring Report Execution and Delivery L11-1

Module 12 Lab: Implementing a SharePoint Server BI Solution L12-1

Module 13 Lab: Using Data Mining to Support a Marketing Campaign L13-1



About This Course


This section provides you with a brief description of the course, audience, suggested prerequisites, and
course objectives.

Course Description
The focus of this course is on creating managed enterprise BI solutions. It describes how to implement
multidimensional and tabular data models, deliver reports with Microsoft SQL Server Reporting
Services, create dashboards with Microsoft SharePoint Server PerformancePoint Services, and discover
business insights by using data mining.

Audience
This course is intended for database professionals who need to fulfill a Business Intelligence (BI) Developer
role to create analysis and reporting solutions. Primary responsibilities include:

Implementing analytical data models, such as OLAP cubes.

Implementing reports, and managing report delivery.


Creating business performance dashboards.

Supporting data mining and predictive analysis.

Student Prerequisites
This course requires that you meet the following prerequisites:

At least two years' experience of working with relational databases, including:


Designing a normalized database.

Creating tables and relationships.

Querying with Transact-SQL.


Some basic knowledge of data warehouse schema topology (including star and snowflake schemas).

Some exposure to basic programming constructs (such as looping and branching).

An awareness of key business priorities such as revenue, profitability, and financial accounting is
desirable.

Course Objectives
After completing this course, students will be able to:

Describe the components, architecture, and nature of a BI solution.

Create a multidimensional database with Analysis Services.


Implement dimensions in a cube.

Implement measures and measure groups in a cube.

Use MDX Syntax.

Customize a cube.

Implement a Tabular Data Model in SQL Server Analysis Services.



Use DAX to enhance a tabular model.

Create reports with Reporting Services.

Enhance reports with charts and parameters.

Manage report execution and delivery.

Implement a dashboard in SharePoint Server with PerformancePoint Services.

Use Data Mining for Predictive Analysis.

Course Outline
This section provides an outline of the course:

Module 1, Introduction to Business Intelligence and Data Modeling

Module 2, Creating Multidimensional Databases


Module 3, Working with Cubes and Dimensions

Module 4, Working with Measures and Measure Groups

Module 5, Introduction to MDX

Module 6, Enhancing a Cube

Module 7, Implementing an Analysis Services Tabular Data Model

Module 8, Introduction to DAX


Module 9, Implementing Reports with SQL Server Reporting Services

Module 10, Enhancing Reports with SQL Server Reporting Services

Module 11, Managing Report Execution and Delivery

Module 12, Delivering BI with SharePoint PerformancePoint Services

Module 13, Performing Predictive Analysis with Data Mining

Course Materials
The following materials are included with your kit:

Course Handbook: A succinct classroom learning guide that provides all the critical technical
information in a crisp, tightly-focused format, which is just right for an effective in-class learning
experience.

Lessons: Guide you through the learning objectives and provide the key points that are critical to
the success of the in-class learning experience.
Labs: Provide a real-world, hands-on platform for you to apply the knowledge and skills learned
in the module.

Module Reviews and Takeaways: Provide improved on-the-job reference material to boost
knowledge and skills retention.

Lab Answer Keys: Provide step-by-step lab solution guidance at your fingertips when it's
needed.

Course Companion Content on the http://www.microsoft.com/learning/companionmoc/ Site:


Searchable, easy-to-navigate digital content with integrated premium on-line resources designed to
supplement the Course Handbook.

Modules: Include companion content, such as questions and answers, detailed demo steps and
additional reading links, for each lesson. Additionally, they include Lab Review questions and answers
and Module Reviews and Takeaways sections, which contain the review questions and answers, best
practices, common issues and troubleshooting tips with answers, and real-world issues and scenarios
with answers.
Resources: Include well-categorized additional resources that give you immediate access to the most
up-to-date premium content on TechNet, MSDN, and Microsoft Press.

Student Course files on the http://www.microsoft.com/learning/companionmoc/ Site: Includes


Allfiles.exe, a self-extracting executable file that contains all the files required for the labs and
demonstrations.

Course evaluation: At the end of the course, you will have the opportunity to complete an online
evaluation to provide feedback on the course, training facility, and instructor.

To provide additional comments or feedback on the course, send email to


support@mscourseware.com. To inquire about the Microsoft Certification Program, send email to
mcphelp@microsoft.com.

Virtual Machine Environment


This section provides the information for setting up the classroom environment to support the business
scenario of the course.

Virtual Machine Configuration


In this course, you will use Microsoft Hyper-V to perform the labs.

The following table shows the role of each virtual machine used in this course:

Virtual Machine Role


20466C-MIA-SQL Application Server

20466C-MIA-DC Domain Controller

Software Configuration
The following software is installed on each VM:

Microsoft Windows Server 2012 R2

Microsoft SQL Server 2014 (on 20466C-MIA-SQL only)

Microsoft SharePoint Server 2013 (on 20466C-MIA-SQL only)

Microsoft Office 2013 (on 20466C-MIA-SQL only)

Microsoft Visual Studio 2013 (on 20466C-MIA-SQL only)



Course Files
There are files associated with the labs in this course. The lab files are located in the folder
D:\Labfiles\LabXX on the 20466C-MIA-SQL VM.

Classroom Setup
Each classroom computer will have the same virtual machine configured in the same way.

Hardware Level 6+

Processor: Intel Virtualization Technology (Intel VT) or AMD Virtualization (AMD-V)


Hard Disk: Dual 120 GB hard disks 7200 RPM SATA or better (Striped)
RAM: 12 GB or higher. 16 GB or more is recommended for this course.
DVD/CD: DVD drive
Network adapter with Internet connectivity
Video Adapter/Monitor: 17-inch Super VGA (SVGA)
Microsoft Mouse or compatible pointing device
Sound card with amplified speakers

In addition, the instructor computer must be connected to a projection display device that supports SVGA
1024 x 768 pixels, 16-bit color.

Note: For the best classroom experience, a computer with solid state disks (SSDs) is recommended. For
optimal performance, adapt the instructions below to install the 20466C-MIA-SQL virtual machine on a
different physical disk than the other virtual machines to reduce disk contention.

To ensure a satisfactory student experience, Microsoft Learning requires a minimum equipment


configuration for trainer and student computers in all Microsoft Certified Partner for Learning Solutions
(CPLS) classrooms in which Official Microsoft Learning Product courseware is taught.

Module 1
Introduction to Business Intelligence and Data Modeling
Contents:
Module Overview 1-1

Lesson 1: Elements of an Enterprise BI Solution 1-2

Lesson 2: The Microsoft Enterprise BI Platform 1-10

Lesson 3: Planning an Enterprise BI Project 1-14

Lab: Exploring a BI Solution 1-21

Module Review and Takeaways 1-25

Module Overview
Business Intelligence (BI) is an increasingly important IT service in many businesses. In the past, BI
solutions were primarily the preserve of large corporations but, as data storage, analytical, and reporting
technologies become more affordable, many small and medium-sized organizations are taking advantage
of BI solutions.

As a SQL Server database professional, you may be required to participate in, or perhaps even lead, a
project that aims to implement an effective enterprise BI solution. Therefore, it is important that you have
a good understanding of the elements that comprise a BI solution, the business and IT personnel typically
involved in a BI project, and the Microsoft products that you can use to implement the solution.

Objectives
After completing this module, you will be able to:

Describe the elements of a typical BI solution.

Select appropriate Microsoft technologies for a BI solution.

Describe key considerations for planning a BI project.



Lesson 1
Elements of an Enterprise BI Solution
Although there's no single definitive template for an enterprise BI solution, there are common elements
that are typical across most BI implementations. Being familiar with these common elements will help you
identify the key components required for your specific BI solution.

Lesson Objectives
After completing this lesson, you will be able to describe:

The common elements in a typical BI solution.

The role of business data sources in a BI solution.


The role of a data warehouse in a BI solution.

The role played by an Extract, Transform, and Load (ETL) process in a BI solution.

The role played by analytical models in a BI solution.

Reporting and analysis in a BI solution.

Overview of a BI Solution
All BI solutions are designed to take data
generated by business operations, structure it into
an appropriate format for consistent analysis and
reporting, and then use the information gained by
examining the data to improve business
performance. No two BI solutions are identical, but
most include the following elements:

Business data sources. The data that will


provide the basis for business decision making
through the BI solution usually resides in
existing business applications or external data
sources. These may be commercially available
data sets or data exposed by business partner organizations.

A data warehouse. To make it easier to analyze and report on the business as a whole, the business
data is consolidated into a data warehouse. Depending on the size of the organization, and the
specific BI methodology adopted, this may be a single, central database that is optimized for
analytical queries or a distributed collection of data marts, each covering a specific area of the
business.

Extract, Transform, and Load (ETL) processes. To get the business data from the data sources into
the data warehouse, an ETL process periodically extracts data from the source systems, transforms the
structure and content of the data to conform to the data warehouse schema, and loads it into the
data warehouse. ETL processes are often implemented within a wider Enterprise Integration
Management (EIM) framework that ensures the integrity of data across multiple systems through
Master Data Management (MDM) and data cleansing.

Analytical data models. The data warehouse schema is usually optimized for analytical querying
and, in some cases, you may decide to perform all analysis and reporting directly from the data

warehouse. However, it is common to build analytical data models on top of the data warehouse to
abstract the underlying data tables, add custom values such as key performance indicators, and
aggregate data for faster analytical processing.

Reporting. Most BI solutions include a reporting element that enables business users to view reports
containing business information. Most reporting solutions provide a set of standard business reports
that are generated on a regular basis, and some also empower users to perform self-service reporting
to generate their own custom reports. Reports can be created directly from the data warehouse or
from analytical data models built on it, depending on your specific business requirements and
constraints.

Analytical Information Worker Tools. In addition to reports, most BI solutions deliver analytical
information to business users through information worker tools. These tools might be locally-installed
applications, such as Microsoft Excel or interactive dashboards in web-based applications, such as
Microsoft SharePoint PerformancePoint Services.

Business Data Sources


Most businesses use software applications to
process business operations. For example, a retail
business might use a Point-Of-Sale (POS) system
to process sales transactions as customers
purchase goods, and an inventory management
system to perform stock control operations, such
as ordering new stock as goods sell out. Most
organizations also use a Human Resources (HR)
system to manage employee records, and many
sales-oriented businesses use Customer
Relationship Management (CRM) systems to
manage customer contact details and sales
opportunities.

The systems used in an organization might be purpose-built applications or based on simple documents,
such as spreadsheets. In some cases, business operations might be automated by sensors or plant
machinery. Regardless of the specific implementation, all these systems generate some form of business
data and this is the starting point for any BI solution.
Business data sources for a BI solution typically include some or all of the following:

Application databases, often implemented as relational databases in systems such as SQL Server,
Oracle, or Microsoft Access.

Proprietary data stores, such as those used by many commercial financial accounting applications.

Documents such as Excel workbooks.

Sensor readings emitted by plant machinery, which may be captured as a data stream using
technologies such as Microsoft SQL Server StreamInsight.

External data sources such as cloud-based databases or web services.

Master data hubs that contain definitive data values for core business entities.

One of the first tasks in any BI project is to audit available data sources and try to identify:

The specific data that is stored in each source.

The volume of data currently stored and being generated by ongoing operations.

The data types and range of values for important business data fields.

Business-specific values used to indicate key information. For example, a POS system may use
numeric codes to indicate payment types, such as 0 for cash, 1 for credit, and so on.

Common errors, reliability issues, and missing or null values in the data.

Data duplication or inconsistencies across multiple systems.

Existing data integration processes.

Data source usage patterns and update periodicity.

Technologies that can be used to extract the source data to a staging database.
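The audit steps listed above can be sketched as a simple profiling pass over a source table. The following is a minimal Python illustration using an in-memory SQLite database — the table, columns, and data are invented stand-ins for a real source system, and a production audit would use dedicated profiling tools:

```python
import sqlite3

# Build a throwaway source table standing in for a POS application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Sales (SaleID INTEGER, PaymentType INTEGER, Amount REAL)")
conn.executemany(
    "INSERT INTO Sales VALUES (?, ?, ?)",
    [(1, 0, 9.99), (2, 1, 24.50), (3, 1, None), (4, 99, 5.00)],
)

def profile_column(conn, table, column):
    """Return row count, distinct non-null values, and NULL count for one column."""
    row = conn.execute(
        f"SELECT COUNT(*), COUNT(DISTINCT {column}), "
        f"SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) FROM {table}"
    ).fetchone()
    return {"rows": row[0], "distinct": row[1], "nulls": row[2]}

print(profile_column(conn, "Sales", "Amount"))
# {'rows': 4, 'distinct': 3, 'nulls': 1} -- one missing Amount to investigate
```

Profiling results like these feed directly into the audit: the NULL count flags a reliability issue, and the distinct values of a coded column such as PaymentType reveal the business-specific codes (including unexpected ones, like 99 here) that the ETL process must handle.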

The Data Warehouse


The data warehouse is the central data repository
on which all reporting and analysis is based.
Typically, it contains numeric business measures
that are important to the business, such as
revenue, cost, or profit. It also contains the key
business entities or dimensions by which those
measures can be aggregated, such as fiscal period,
customer, or product.

Kimball and Inmon Methodologies


Typically, a data warehouse is implemented as a
relational database in which the business data has
been denormalized into a star schema consisting
of fact tables that contain numeric measures and dimension tables that contain attribute hierarchies across
which the measures are aggregated. This approach reflects the dimensional model methodology
promoted by Ralph Kimball, and is the most common approach adopted by Microsoft SQL Server
customers. In the Kimball methodology, the data warehouse may consist of multiple data marts, each
dealing with a specific area of the business. The fact and dimension tables in these data marts are
conformed so that they share the same grain (granularity) and dimension attributes across all data marts.
This enables the data marts to be independent data stores that can be logically viewed as a single
Enterprise Data Warehouse (EDW). The Kimball methodology is often referred to as a "bottom-up"
approach.

An alternative data warehouse design, popularized by Bill Inmon, is the Corporate Information Factory
(CIF) model where the enterprise data warehouse stores the business data in a normalized relational
schema. This is then used to feed departmental data marts, in which specific subsets of the data are
exposed in a star schema. The dependency of the data marts on a central EDW leads many to refer to the
Inmon methodology as a "top-down" approach.
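To make the star-schema idea concrete, here is a minimal sketch in Python using an in-memory SQLite database. The fact table, dimension table, and data are all invented for illustration — a real SQL Server data warehouse would contain many dimensions and far more data:

```python
import sqlite3

# A minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DimProduct (ProductKey INTEGER PRIMARY KEY, Category TEXT);
CREATE TABLE FactSales  (ProductKey INTEGER, SalesAmount REAL);
INSERT INTO DimProduct VALUES (1, 'Bikes'), (2, 'Accessories');
INSERT INTO FactSales  VALUES (1, 100.0), (1, 250.0), (2, 25.0);
""")

# A typical star-join query: aggregate a numeric measure from the fact
# table across an attribute of the dimension table.
rows = conn.execute("""
    SELECT d.Category, SUM(f.SalesAmount)
    FROM FactSales AS f
    JOIN DimProduct AS d ON f.ProductKey = d.ProductKey
    GROUP BY d.Category
    ORDER BY d.Category
""").fetchall()
print(rows)  # [('Accessories', 25.0), ('Bikes', 350.0)]
```

This join-and-aggregate shape is exactly the star-join pattern that the SQL Server database engine is optimized for, and it is the pattern that analytical data models pre-compute on a larger scale.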

Common Implementations
Although the Kimball and Inmon methodologies, in their pure form, are designed for BI solutions that
distribute data across multiple departmental data marts, it is common for organizations to begin with a
Kimball-style data mart for a business subset that eventually expands into a single, central data warehouse
database for the entire enterprise. The availability of inexpensive storage and the increasing power of
server hardware mean that a single data warehouse can support a huge volume of data and heavy user
workloads.

In very large enterprises, a federated approach is often used in which a hub-and-spoke architecture
synchronizes departmental data marts with a central enterprise data warehouse.

Note: SQL Server can be used to support both Kimball and Inmon style data warehouse
solutions. In response to the more common use of the Kimball methodology, the SQL Server
database engine has been designed to optimize star-join queries and most documentation about
data warehouse implementation in SQL Server assumes a dimensional model rather than a
normalized EDW. In recognition of these facts, this course focuses on a Kimball-style data
warehouse. However, you should investigate the details of both approaches and consider which
best suits your specific business requirements and constraints.

Extract, Transform, and Load Processes


If the data warehouse is the central brain of the BI
solution, the ETL system is the heart. It pumps
business data through to keep the data warehouse
up to date and ensure that the overall BI solution
continues to deliver value to the business. A
significant proportion of the effort to design and
build an effective BI solution goes into the ETL
sub-system.

ETL Processes in a Data Warehousing


Solution
ETL is about more than just copying data from a
source to the data warehouse. ETL processes must
extract data efficiently, filtering extraction operations to include only new or changed rows wherever
possible. Typically, the extracted data is initially loaded into a staging area. Loads to the data warehouse
tables can then be synchronized across data extracted from multiple sources and performed at the most
appropriate time. Additionally, the ETL process applies transformations to the data before it is loaded, to
ensure that the data values and schema conform to the data warehouse dimensional model. Finally, when
loading the data warehouse, an important part of the ETL process is to handle slowly-changing
dimensions (dimension records that must be updated while retaining historical versions). When loading
large volumes of fact or dimension data, the ETL process must minimize the adverse impact on data
warehouse queries and ensure that data is loaded as quickly as possible.

Another consideration for ETL is the logging strategy that you will use to record ETL activity and provide
troubleshooting information in the event of a failure somewhere in the ETL process.
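The incremental-extraction idea described above can be sketched as a watermark-based filter. This is a simplified Python illustration with invented row shapes and dates — in practice, tools such as SSIS provide this capability, often combined with change data capture in the source system:

```python
from datetime import date

# Source rows as (id, name, modified) tuples; in a real ETL process these
# would be read from the operational system, not a literal list.
source = [
    (1, "Road Bike", date(2024, 1, 5)),
    (2, "Helmet",    date(2024, 3, 2)),
    (3, "Gloves",    date(2024, 3, 9)),
]

def extract_incremental(rows, last_watermark):
    """Extract only rows changed since the previous load (the 'E' of ETL)."""
    return [r for r in rows if r[2] > last_watermark]

# The watermark records when the previous extraction ran; only newer
# rows are copied to the staging area.
staging = extract_incremental(source, date(2024, 2, 1))
print([r[1] for r in staging])  # ['Helmet', 'Gloves']
```

After a successful load, the watermark is advanced to the extraction time, so the next run again touches only new or changed rows rather than re-copying the whole source.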

Enterprise Integration Management


ETL is a subset of a larger framework for managing data known as Enterprise Integration Management
(EIM). Software vendors and database professionals differ on the specific details of the elements that
comprise an EIM solution but, in the SQL Server platform, EIM is generally considered to include:

ETL capabilities provided by SQL Server Integration Services (SSIS).

Data cleansing and matching capabilities provided by Data Quality Services (DQS).

Master Data Management (MDM) capabilities provided by Master Data Services (MDS).

Using ETL for Application Data Integration


In addition to populating and refreshing the data warehouse, ETL processes can be used to synchronize
data across multiple business applications. For example, to refresh product data in an e-commerce system
from a centrally-managed catalog. To accomplish this, you can use SSIS or other synchronization
technologies such as SQL Server replication. When planning a BI solution in environments where data is
transferred between source systems, it is important to understand the lineage of the data and be aware of
the schedule on which transfers occur.

Managing Data Quality


In any system that relies on user input, there is a risk that the quality of data will be impaired because of
erroneous or duplicate data entry. Although most applications perform a degree of input validation, it is
possible that some data will be invalid (for example, a user may enter "New Yrk" instead of "New York"),
inconsistent (for example, one user may enter "CA" and another "California"), or duplicated (for example,
an existing customer may re-register on an e-commerce site with a different email address because his or
her password has been forgotten). Data quality technologies, such as DQS, enable you to automate the
identification of invalid or inconsistent column values and duplicate rows. This means source data can be
cleansed and de-duplicated before being loaded into a data warehouse, improving the integrity of the
business analysis and reporting provided by your BI solution.
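The value correction described here can be sketched as a lookup against a known domain of valid values. The following toy Python example is only a stand-in for what a DQS knowledge base does at much larger scale, and the correction table is invented for illustration:

```python
# A toy "domain" of valid values with known corrections.
corrections = {"New Yrk": "New York", "CA": "California"}
valid = {"New York", "California"}

def cleanse(value):
    """Return (cleansed_value, status) for a single column value."""
    if value in valid:
        return value, "valid"          # already conforms to the domain
    if value in corrections:
        return corrections[value], "corrected"  # known misspelling or synonym
    return value, "invalid"            # flag for manual review

print(cleanse("New Yrk"))  # ('New York', 'corrected')
print(cleanse("Texas"))    # ('Texas', 'invalid')
```

Running source values through a step like this before the data warehouse load is what allows invalid and inconsistent entries to be repaired (or quarantined for review) rather than polluting downstream analysis.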

Master Data Management


In an organization with multiple business applications, it is possible for the same business entity to be
represented in multiple data sources. This presents the challenge of identifying the definitive version of
the entity. For example, an organization might store customer details in a CRM system, an order
processing system, and an e-commerce site profile management system. If the same customer exists in all
three systems and the address data does not match, it is difficult to establish which address on record is
correct. By implementing an MDM system, the organization can establish a definitive master record and
use it to ensure data integrity across all systems.
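One common way to resolve such conflicts is a source-precedence "survivorship" rule: when systems disagree, the value from the most trusted system wins. The sketch below is a deliberately simplified Python illustration (the systems, customer, and addresses are invented; MDS supports much richer rules, versioning, and stewardship workflows):

```python
# Three systems hold the same customer with conflicting addresses.
records = [
    {"source": "ecommerce", "customer": "C1", "address": "12 Oak St"},
    {"source": "crm",       "customer": "C1", "address": "14 Oak St"},
    {"source": "orders",    "customer": "C1", "address": "12 Oak St"},
]

# Survivorship rule: trust sources in this order of precedence.
precedence = ["crm", "orders", "ecommerce"]

def master_record(records, precedence):
    """Pick the definitive record by source precedence."""
    return min(records, key=lambda r: precedence.index(r["source"]))

print(master_record(records, precedence)["address"])  # '14 Oak St' (CRM wins)
```

The surviving record becomes the master version, which can then be pushed back to the other systems (or at least used by the data warehouse load) so that all downstream reporting agrees on one address for the customer.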

Analytical Data Models


Most corporate BI solutions include analytical data
models that provide information workers with a
way to "slice and dice" data warehouse measures
by aggregating them across the dimensions.
Often, these analytical models are referred to as
"cubes". Technically, a cube is a specific organization
of measures and dimensions in a multidimensional
data model, but the word "cube" is commonly
used as a generic term referring to any data model
that enables users to aggregate measures by
business entities.

Benefits of Analytical Data Models


It is possible to create BI solutions that support reporting and analysis directly from tables or views in the
data warehouse. However, in most scenarios, creating a separate analytical data model layer results in the
following benefits:

The data model abstracts the underlying data warehouse tables, which enables you to create models
that reflect how business users perceive the business entities and measures regardless of the data
warehouse table schema. If necessary, you can modify or expand the underlying data warehouse
without affecting the data model used by business users for analysis.

Because the data model reflects the users' view of the business, data analysis is easier for information
workers with little or no understanding of database schema design. You can use meaningful names
for tables and fields and define hierarchies based on attributes in dimension tables that make the
data more intuitive for business users.
You can add custom logic to a data model that increases business value when analyzing the data. For
example, you can define Key Performance Indicators (KPIs) that make it easier to compare actual
business measures with targets.
Although the SQL Server database engine can provide extremely high query performance, a data
warehouse typically contains a massive volume of data. Because most analysis involves aggregating
measures across multiple dimensions, the processing overhead for complex queries can result in
unacceptable response times, especially when many users access the data concurrently. A data model
typically pre-aggregates the data, which provides vastly superior performance for analytical queries.

Data models are a common feature in BI solutions and a number of standards have been established.
By creating a data model, you can expose analytical data through a standard interface to be
consumed by client applications, such as Microsoft Excel or third-party analytical tools.
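As an example of the custom logic mentioned above, a KPI that compares an actual measure against a target might be classified as follows. This Python sketch is illustrative only — the 5 percent tolerance is an arbitrary assumed threshold, and in a real model the KPI would be defined in MDX or DAX:

```python
def kpi_status(actual, target, tolerance=0.05):
    """Classify a KPI: above target, on target (within tolerance), or below."""
    if actual >= target:
        return "above"
    if actual >= target * (1 - tolerance):
        return "on target"
    return "below"

print(kpi_status(98_000, 100_000))   # 'on target' (within 5% of goal)
print(kpi_status(105_000, 100_000))  # 'above'
print(kpi_status(80_000, 100_000))   # 'below'
```

Defining this logic once, in the model layer, means every report and dashboard that consumes the model shows the same status for the same numbers, rather than each report re-implementing the comparison.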

Types of Analytical Data Model


SQL Server 2014 supports two kinds of analytical models:

Multidimensional data models. Multidimensional data models have been supported in every
version of SQL Server Analysis Services since the release of SQL Server 7.0. You can use a
multidimensional data model to create an Analysis Services database that contains one or more
cubes, each providing aggregations of measures in measure groups across multiple dimensions.

Tabular data models. Tabular data models were first introduced with PowerPivot for Microsoft Excel in
SQL Server 2008 R2 and extended to Analysis Services in SQL Server 2012. For a user
performing analysis, tabular models provide similar functionality to a multidimensional model. In
many cases, the two models are indistinguishable from one another. For BI developers, tabular
models do not require as much Online Analytical Processing (OLAP) modeling knowledge as
multidimensional models, because they are based on relationships between multiple tables of data.

Reporting and Analysis


The primary purpose of a BI solution is to help
organizations track and improve business
performance through reporting and analysis.

Reporting
Reporting is the communication of information
gained from BI. Most organizations rely on reports
to summarize business performance and activities.
Consequently, most BI solutions include an
element that generates reports. Typical examples
are financial and management reports that include
cash flow, profit and loss, balance sheet, and open
orders. A retail business might require stock
inventory reports, whereas a technical support call center might need a report that shows call log data.

In some scenarios, users might need to view reports interactively in a web browser or custom application.
In other instances, the requirement might be to send reports as email attachments in specific formats,
such as Excel workbooks or Word documents. In many cases, the reports might need to be printed, for
example, to send a physical report to customers or shareholders. When planning a reporting solution, you
must consider the reports that are required, the audiences for those reports, and how they will be
delivered.

Regardless of the specific reports that are required, or how they will be distributed and consumed, there
are two common approaches to report generation in most BI solutions:

IT-provided reports. Traditionally, standard business reports are created by a specialist report
developer and automatically generated with current data on request or on a regular basis. Although
the reports may be developed by a business user with report development skills, they are generally
supported by IT and delivered through the organization's reporting infrastructure.

Self-service reporting. As business users have become more technically proficient and report
authoring tools have become easier to use, many organizations supplement standard reports with the
ability for users to create their own reports with no intervention from IT. For self-service reporting to
be effective, some initial work must be done to design and implement a suitable reporting
infrastructure. After that is in place, users can benefit from the ability to customize the reports they
use without placing an additional burden on the IT department.

Analysis
Analysis is the interpretation of business data delivered by the BI solution. For business analysts in
particular, performing analysis is a discrete activity that involves using specialist analytical tools to examine
data in analytical models. For others, analysis is simply a part of everyday work and involves using reports
or dashboards as a basis for business decision making.
In general, when planning a BI solution, you should consider the following kinds of analytical
requirements:
MCT USE ONLY. STUDENT USE PROHIBITED
Implementing Data Models and Reports with Microsoft SQL Server 1-9

Interactive analysis. Some BI solutions must support interactive slice and dice analysis in business
tools such as Microsoft Excel or specialist data analysis tools. The resulting information can then be
published as a report.

Dashboards and scorecards. Commonly, analytical data can be summarized in a dashboard or
scorecard and embedded into business applications or portals, such as SharePoint Server sites. These
solutions might provide some interactivity to enable users to drill down into specific details, or they
may simply show important KPIs.

Data mining. Most analysis and reporting concerns historical data, but a BI solution can also support
predictive analysis by using historical data to determine trends and patterns.

Data Sources
You can access data for analysis and generate reports from virtually any data source but, in a BI solution,
reports are commonly based on one of the following data sources:

Analytical data models. If you have created analytical data models in your BI solution, you can use
them as a source for analysis and reports. This approach enables you to benefit from the data models
in your reporting solution, as described in the previous topic.

The data warehouse. You can create analytical reports directly from the data warehouse or a
departmental data mart. This enables you to express queries in Transact-SQL which may be more
familiar to a report developer than a data modeling query language such as MDX or DAX.
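For example, a typical analytical report query against a star schema can be written entirely in Transact-SQL. The following sketch is illustrative only; the table and column names are assumptions about a generic data warehouse schema, not references to a specific database:

```sql
-- Illustrative star-schema query: total sales amount by calendar year.
-- Assumes a FactSales fact table joined to a DimDate dimension table
-- through an OrderDateKey surrogate key.
SELECT d.CalendarYear,
       SUM(f.SalesAmount) AS TotalSalesAmount
FROM dbo.FactSales AS f
INNER JOIN dbo.DimDate AS d
    ON f.OrderDateKey = d.DateKey
GROUP BY d.CalendarYear
ORDER BY d.CalendarYear;
```

A report developer who knows Transact-SQL can author queries like this directly, without first learning MDX or DAX.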

Note: Considerations for designing a reporting solution are discussed in more depth later
in this course.

Lesson 2
The Microsoft Enterprise BI Platform
Microsoft products are used to provide IT infrastructure for organizations all over the world. Therefore, it
makes sense for many of these organizations to consider using the Microsoft BI platform to benefit from
the close integration and common infrastructure capabilities of the various products that can help deliver
an enterprise BI solution.

As a Microsoft BI professional, you need to know which products can be used to implement the various
elements of a BI solution, and how those products can be integrated to work together.

Lesson Objectives
After completing this lesson, you will be able to describe:

The role of Windows Server in a BI solution.

The role of SQL Server in a BI solution.


The role of SharePoint Server in a BI solution.

The role of Office applications in a BI solution.

Windows Server
Microsoft Windows Server is the foundation for
a Microsoft-based enterprise solution and
provides a number of core infrastructure services,
including:

Network communication and management.

Active Directory-based authentication and security management.

Core application services, such as the Microsoft .NET Framework and Internet Information Services (IIS).

Storage management, NTFS disk volumes, and Storage Spaces.

Failover Clustering.

Virtualization.

Windows Server Editions


Windows Server 2012 R2 is available in the following editions:

Windows Server 2012 R2 Datacenter. This edition provides all features of Windows Server and is
optimized for highly virtualized environments.

Windows Server 2012 R2 Standard. This edition provides all features of Windows Server and is
designed for physical or minimally virtualized environments.
Windows Server 2012 R2 Essentials. This edition is designed for small businesses with up to 25
users and 50 client devices.

Windows Server 2012 R2 Foundation. This edition is designed for environments with up to 15
users.

Note: Although Windows Server 2012 R2 includes comprehensive infrastructure
management tools, large enterprises might consider also using Microsoft System Center 2012
R2 products to manage enterprise infrastructure.

SQL Server
Microsoft SQL Server 2014 provides the core
data services for a BI solution. These services
include:

The SQL Server database engine, which is used for application databases, operations databases, and
the data warehouse throughout the BI solution.

SQL Server Integration Services (SSIS), which is used as the primary platform for ETL processes.

Data Quality Services (DQS), which provides data cleansing and matching capabilities.

Master Data Services (MDS), which provides master data management capabilities.

SQL Server Analysis Services (SSAS), which provides a storage and query processing engine for
multidimensional and tabular data models.

SQL Server Reporting Services (SSRS), which provides a platform for publishing and delivering reports
that users can consume through a native web-based interface or have delivered by subscriptions.

SQL Server 2014 Editions


SQL Server 2014 is available in the following core editions:

SQL Server 2014 Enterprise. You should use this edition for data warehouses and BI solutions that
require advanced SSIS features, such as fuzzy logic and Change Data Capture (CDC) components.

SQL Server 2014 Business Intelligence. You should use this edition for servers hosting SSIS, DQS,
and MDS. You should also use this edition for SSRS and SSAS solutions that require more than 16
processor cores, or if you need to support tabular data models, PowerPivot for SharePoint, Power
View for SharePoint, or advanced data mining.

SQL Server 2014 Standard. You can use this edition for solutions that require basic SSRS reporting,
SSAS multidimensional models, and basic data mining.

Note: SQL Server 2014 is also available in Web and Express editions, but these are generally
not appropriate for BI solutions. A special edition of SQL Server named Parallel Data Warehouse
(PDW) provides support for Massively Parallel Processing (MPP) data warehouse solutions, but
this edition is only available pre-installed on an enterprise data warehouse appliance from
selected Microsoft hardware partners.

SharePoint Server
Microsoft SharePoint Server 2013 provides
enterprise information sharing services through
collaborative websites. SharePoint Server provides
the following BI capabilities:

Excel Services. Users can view and interact with Excel workbooks that are shared in a SharePoint
document library through a web browser. This includes workbooks that use data connections to
query data in a data warehouse or Analysis Services data model.

PowerPivot Services. Users can share and interact with Excel workbooks that contain a PowerPivot
tabular data model. This enables business users to create and share their own analytical
data models.

Integration with SSRS. You can deliver and manage reports and data alerts through SharePoint
document libraries instead of the native Report Manager interface provided with SSRS.

Power View. Power View is an interactive data visualization technology through which users can
graphically explore a data model in a web browser.

PerformancePoint Services. PerformancePoint Services enables BI developers to create dashboards
and scorecards that deliver KPIs and reports through a SharePoint site.

Office Applications
Microsoft Office 2013 and Office 365 provide
productivity applications that business users can
use to consume and interact with BI data. These
applications include:

Microsoft Excel. Excel is the most commonly used data analysis tool in the world, and can be
used to:

o Import data from a data warehouse and use it to create charts and reports.

o Create interactive PivotTables and PivotCharts from analytical data models in SSAS or PowerPivot.

o Create PowerPivot workbooks that contain tabular data models.

o Create Power View visualizations from tabular models.

Microsoft Word. Word is a document-authoring tool. In a BI scenario, users can export SSRS
reports in Word format and use Word's editing and reviewing tools to enhance them.

Microsoft PowerPoint. PowerPoint is a widely-used presentation tool. Users can save Power View
visualizations as PowerPoint presentations, and present business data in a dynamic, interactive format.

Microsoft Visio. Visio is a diagramming tool that can be used to visualize data mining analyses.

Note: Microsoft Office 2013 and Microsoft Office 365 include comprehensive support
for self-service BI, including data discovery, data modeling, and data visualization tools. The focus
of this course is on building enterprise BI solutions with SQL Server, with minimal discussion of
self-service BI scenarios. For a more detailed exploration of how to implement self-service BI with
Microsoft Office technologies, consider attending course 20467C: Designing Self-Service Business
Intelligence and Big Data Solutions.

Lesson 3
Planning an Enterprise BI Project
Statistics show that a surprisingly high number of BI projects fail in organizations throughout the world.
Often projects are abandoned before completion, fail to deliver all the originally-specified deliverables, or
simply do not deliver a solution that adds value to the business. In many cases, the fundamental cause of
failure is that the project was insufficiently envisioned or that key stakeholders were not included in the
planning.
Careful planning can help to ensure that a BI project runs smoothly with a successful outcome. By
applying some common best practices, you can increase the likelihood that your BI project will not be
added to the long list of failures.

Lesson Objectives
After completing this lesson, you will be able to:

Describe key features of a BI project.

Identify infrastructure commonly used in a BI project.

Identify common project personnel in a BI project.

Describe the role of business stakeholders in a BI project.

Plan the scope of a BI project.

BI Project Overview
There are numerous frameworks for planning and
managing IT projects, and many organizations
have a policy to use a specific approach when
implementing a new solution. Whichever
approach you use, a BI project must start with the
business requirements, using these to inform the
design of the overall technical architecture, the
data warehouse and ETL, along with the reporting
and analysis provided by the solution.

Business Requirements
The most important consideration when planning
a BI project is that the core purpose is to improve
the business. Most IT projects require a deep understanding of technology but, in a BI project, you must
also have detailed knowledge of how various business processes work and interact with one another, and
what the commercial aims are.

Understanding the overall structure, processes, and goals of the business makes it easier to gather,
interpret, and prioritize requirements for the BI solution. Typically, BI requirements are about quantifying
core business metrics across various aspects of the business in order to measure performance and inform
decisions. For example, there may be a requirement for the solution to enable sales managers to see
monthly sales revenue by salesperson in order to reward success and identify employees needing
additional support or motivation.

There might be a requirement to view quarterly order amounts by product line to plan more efficient
manufacturing based on demand trends. Only after you have identified the specific business requirements
for your BI solution can you start considering the design of the infrastructure, data warehouse and ETL
solution, and analytical reports.

Technical Architecture and Infrastructure Design


With a good understanding of the business requirements, you can start planning the overall solution
architecture. You can identify the required elements of the BI solution (as described in the first lesson of
this module) and consider the software products you want to use to implement those elements (as
described in the second lesson in this module).

After selecting your technologies, you can start to design the infrastructure for the BI solution, including
server hardware and configuration, security, and high availability considerations.

Data Warehouse and ETL Design


Business requirements determine the data that the BI solution must include. Specifically, these are the
numeric measures that users need to aggregate (for example, revenue or profit), and the business
dimensions across which they must be aggregated (for example, salesperson or product line). When you
have identified the data needed to meet the business requirements, you can start designing the data
warehouse in which that data will be stored and the ETL process populating and refreshing the data
warehouse from the business applications where the data currently resides.
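For example, the requirement to aggregate sales revenue by salesperson might translate into a fact table containing a SalesAmount measure and a foreign key to a salesperson dimension. The following Transact-SQL is a minimal sketch of that dimensional design; all object names are hypothetical, not part of any specific solution:

```sql
-- Illustrative dimension table. The surrogate key (SalespersonKey)
-- insulates the warehouse from changes to source-system keys.
CREATE TABLE dbo.DimSalesperson
(
    SalespersonKey INT IDENTITY(1,1) PRIMARY KEY,
    EmployeeID     INT NOT NULL,           -- business key from the source system
    FullName       NVARCHAR(100) NOT NULL
);

-- Illustrative fact table. SalesAmount is the numeric measure;
-- the key columns define the dimensions it can be aggregated across.
CREATE TABLE dbo.FactSales
(
    OrderDateKey   INT NOT NULL,           -- references a date dimension
    SalespersonKey INT NOT NULL
        REFERENCES dbo.DimSalesperson (SalespersonKey),
    SalesAmount    MONEY NOT NULL
);
```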

Reporting and Analysis Design


In many cases, business requirements for a BI solution are expressed as specifications for reports or
analytical data sets. Business users often describe the reports they need to consume, the dashboards they
want to view, or the PivotTables they want to create. Using business requirements to identify the
information that users require from a BI solution helps you engage with them to refine your
understanding of how they want to consume or interact with that information. This will enable you to
design an appropriate solution for analysis and reporting.

Monitoring and Optimization


Performance optimization is a consideration in the design and implementation of all elements of the BI
solution, and your project planning should include consideration of how performance of the overall
solution will be monitored and optimized as the volumes of data and users grow. In particular, you must
consider how you will measure performance, what expectations users have for performance, and how to
identify performance degradation.

Operations and Maintenance


When planning a BI solution, it is easy just to focus on the functional requirements. However, it is
important to also consider operational requirements and factor them into the design. Operational
requirements for a BI solution include a number of ongoing tasks, such as index maintenance in the data
warehouse, backup procedures for all databases and data stores used in the solution, scheduled
automation of ETL tasks, data model processing, logging and auditing, and many others.
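As a simple illustration, some of these tasks can be scripted in Transact-SQL and scheduled with SQL Server Agent. The database, table, and file names below are assumptions for the purpose of the example:

```sql
-- Illustrative nightly maintenance: rebuild the indexes on a large
-- fact table, then take a compressed full backup of the data warehouse.
ALTER INDEX ALL ON dbo.FactSales REBUILD;

BACKUP DATABASE AdventureWorksDW
TO DISK = N'D:\Backups\AdventureWorksDW.bak'
WITH COMPRESSION, INIT;
```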

Project Infrastructure
It is easy to focus on the infrastructure
requirements of a proposed solution and overlook
the infrastructure required to actually build it. In
the same way that a construction project for an
office building requires a site office, parking
facilities for the construction crew, and so on, a BI
project requires hardware and software resources
for the project team to use during the solution
development.

Project Management Infrastructure


From the beginning, a project requires
infrastructure to enable team members to
communicate, collaborate, and document project planning information. Examples of this kind of
infrastructure include:

Office productivity applications.

Microsoft Project.

A SharePoint Server site for the project.

Design and Development Tools


When designing the BI solution, the team will need tools such as:

Microsoft Visio to support diagrammatic design.

Microsoft Visual Studio with the SQL Server Data Tools for BI to develop data models, reports, and
SSIS packages, as well as other Visual Studio components to develop custom application components.

Team Foundation Server (TFS) to provide source control and issue tracking capabilities.

Development and Test Infrastructure


In addition to development tools, the project will require servers on which to develop and test the
elements of the solution. These servers will require the same software used in the production solution, for
example, SQL Server and SharePoint Server. The test servers should be configured as similarly as possible
to the envisioned production infrastructure. However, considering the complexity of some
enterprise-scale BI solutions, you may choose to simplify the test environment, for example, by
provisioning a single-server installation of SharePoint Server instead of a multi-server farm,
combining SQL Server components on a single server instead of dedicated servers, and using
standalone servers instead of failover clusters.

Project Personnel
A BI project involves several roles which include:

A project manager. Coordinates project tasks and schedules, ensuring that the project is
completed on time and within budget.

A BI solution architect. Has overall responsibility for the technical design of the
data warehousing solution.

A data modeler. Designs the data warehouse schema and analytical data models.

A database administrator. Designs the physical architecture and configuration of the
data warehouse database. Database administrators with responsibility for data sources used in the
data warehousing solution must also be involved in the project to provide access to the data sources
used by the ETL process.

An infrastructure specialist. Implements the server and network infrastructure for the data
warehousing solution.

An ETL developer. Builds the ETL workflow for the data warehousing solution.

A report developer. Creates the reporting elements of the BI solution.

Business users. Provide requirements and help to prioritize the business questions that the data
warehousing solution will answer. Often, the team includes a business analyst as a full-time member
to help interpret the business questions and ensure that the solution design meets users' needs.

Testers. Verify the business and operational functionality of the solution as it is developed.

Note: The list in this topic is not exhaustive, but represents roles that must be performed. In some
cases, multiple roles may be performed by a single person, although, in general, you should avoid
having testers validate their own development work.
In addition to the technical project personnel listed here, the project team should include
business stakeholders from the beginning of the planning phase. The roles performed by
business stakeholders are discussed in the next topic.

Business Stakeholders
The previous topic described the technical roles
required in a BI project. However, the project team
should also include representatives from key areas
of the business to help ensure that the solution
meets business requirements and to promote user
acceptance.

Executive Sponsor
The culture of each organization is unique but, in
almost all businesses, a BI project will face
personality clashes and political obstacles that must be navigated to create a solution that is in the best
interests of the business as a whole. Employees tend to focus on their specific areas of the business and
can often be resistant to changes that affect their day-to-day activities or to what they see as external
interference.

The challenge of obtaining buy-in from business users is easier to overcome if there is an executive
sponsor who has aligned project goals with the strategic aims of the business and can champion the
project at the organization's highest level. When the BI project team meets resistance or contradictory
views from business users, the executive sponsor can use his or her influence to resolve the issue.

Business Users
Although executive sponsorship is essential to drive the project forward, it is important to take into
account the input from business users. A solution enforced on users without consultation is unlikely to be
accepted. In most cases, it is unlikely that the primarily technical members of the project team have
sufficient business knowledge to create a useful solution even if users could be persuaded to accept it.

Businesses are complex ecosystems in which many processes interact to achieve multiple objectives. In
some organizations, business processes are formally defined and documented but, even when this is the
case, it is likely that day-to-day activities vary, often significantly, from official practices. Generally,
business users have a better insight into how the processes actually work, what the various data elements
used in those processes actually mean, and how important they are, than can be gained by a technical
architect examining existing systems and their documentation.

For example, suppose an existing sales processing system includes a data field named SZ_Code with
values such as STD-140 and SPC-190. Usage of this field is not listed in the application documentation,
yet you see it is used in approximately 75 percent of sales orders. Only a business user who is familiar with
the sales order process could tell you that the field represents a size code for products that are available in
multiple sizes, and that the value STD-140 represents a standard size of 140 centimeters, whereas SPC-190
means that the product was ordered in a special-order size of 190 centimeters that had to be custom
made.

Data Stewards
Some information workers have particularly detailed knowledge of the business processes and data in a
specific area. By formally including these people in the BI project team, they can adopt the role of data
steward (sometimes referred to as data governor or data curator) for the data elements used in their area
of the business. A data steward can provide valuable services to the project, including:

Representing the interests of a specific business area while the BI solution is planned. For example,
ensuring that all data elements that are important to that business area are included in the data
warehouse design and that reports required by that business area are considered.

Validating and interpreting data values in the source systems that will be used to populate the data
warehouse, and helping to identify the appropriate transformations and exceptions that will need to
be implemented.

Taking ongoing responsibility for maintaining a Data Quality Services (DQS) knowledge base for the
business area, so that data values can be cleansed and matched effectively.

Taking ongoing responsibility for maintaining relevant business entities in a Master Data Services
(MDS) model to ensure consistency of data definitions across the organization.

Project Scope
From the beginning of a project, it is important to
prioritize the business requirements in terms of
their value to the business, and the feasibility of
meeting them with specific constraints, such as
available data, budget, and project deadlines. This
enables you to scope the project to maximize the
chances of it successfully delivering value to the
business.

Initial Scoping
After gathering the initial requirements, the
project team and business stakeholders must
negotiate their importance or value. At this stage,
you may be able to judge the feasibility of meeting some objectives but others will require further
investigation to identify suitable source data or to estimate the effort required.

You can use a matrix to record the relative value and feasibility of each requirement as they are agreed by
team members. It is likely that there will be disagreements about the importance of some objectives, and
feasibility may not be easy to assess. In these cases, you should make a note of the issues and move on. At
this stage, it is important to get a comprehensive view of the potential project scope. Further iteration of
the design process will gradually resolve prioritization conflicts and help clarify feasibility.

Refining the Scope


After the initial scoping discussion, the feasibility of the identified requirements can be investigated.
Typically, this involves:

Using the techniques for auditing data sources discussed in the first lesson of this module to
determine whether sufficient data is available and accessible to meet the requirements.

Estimating the development effort and skills required for each of the requirements.

As investigations reveal more information, the team should meet to refine the matrix created during the
initial scoping exercise.

Identifying a Pilot Scope


When the scope is considered to be well defined, the team should examine the requirements in the high
value, high feasibility quadrant of the matrix and further prioritize them to determine a set of
requirements that can be addressed in a pilot or proof-of-concept solution.

Using a pilot project enables you to reduce the time it takes for the BI project to add value to the
business. By prioritizing the requirements based on their value and feasibility, you can quickly
demonstrate the effectiveness of the BI initiative without losing the momentum the project has built up
during the initial scoping phase. In most cases, the pilot focuses on a related set of requirements, often in
a specific, high-profile area of the business. However, because you have used the scoping phase to
consider all requirements, you can design the pilot with extensibility in mind, ensuring that its design will
support the addition of other highly important business requirements later.

After scoping the pilot, you can start designing the solution. However, you must make sure that the
project team carefully considers the following questions:

How will the pilot incorporate User Acceptance Testing (UAT)? Instead of delivering the solution to all
users in the affected area of the business, you may want to enroll a subset of users in the pilot
program with the aim of providing feedback on the usability and usefulness of the solution. Often,
these users can provide valuable feedback that results in improvements to the design of reports, data
models, dashboards, SharePoint document library structures, and other user-visible aspects of the
solution.

How will you measure the success of the pilot? Other than qualitative measures based on feedback
from users, you should consider quantitative goals. The criteria for success should ultimately be
aligned with the business goals, so you should measure the effects of the solution in terms of revenue
growth, increased profitability, reduced costs, increased customer satisfaction survey scores, or
whatever quantifiable goal the BI solution is intended to help the business achieve. Therefore, you
should determine a realistic time interval over which the success of the project should be assessed.

Lab: Exploring a BI Solution


Scenario
Adventure Works Cycles is a multinational corporation that manufactures and sells bicycles and cycling
accessories. The company sells its products through an international network of resellers and, in recent
years, has developed a direct sales channel through an e-commerce website.

The company is financially sound and has a strong order book but sales volumes have remained relatively
static for the past few years. Senior management is under pressure from shareholders to develop a growth
strategy that will drive increased revenue and profit. Management believes that a key factor in their
growth strategy is investment in technology that improves collaboration between the various divisions of
the company, and enables them to track and share key business performance metrics.

Objectives
After completing this lab, you will be able to:

Identify SQL Server 2014 technologies in a BI architecture.


Analyze data and consume reports.

Estimated Time: 45 Minutes

Virtual machine: 20466C-MIA-SQL


User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Exploring the Data Warehouse


Scenario
Adventure Works Cycles has a data warehouse containing data relating to Internet and reseller sales. This
data warehouse will form the foundation for an enterprise BI solution to include SQL Server Analysis
Services data models and a SQL Server Reporting Services solution for reports.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Explore the Data Warehouse Schema

3. Query the Data Warehouse

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab01\Starter folder as Administrator.

Task 2: Explore the Data Warehouse Schema


1. Use SQL Server Management Studio to connect to the MIA-SQL SQL Server instance.

2. Create a database diagram for the AdventureWorksDW database, and note the following details
about its schema:

o The FactInternetSales table stores details of sales orders made through the Adventure Works
website. The table is related to the DimCustomer table, which contains details of the customers
who have placed orders.

o The FactResellerSales table stores details of sales orders made to resellers. The table is related to
the DimReseller table, which contains details of the resellers who have placed orders.

o The DimDate table stores values for dates, including details of the calendar and fiscal periods in
which individual dates occur.

o The FactInternetSales and FactResellerSales tables are related to the DimDate table by
multiple fields. These fields represent the order date (the date the order was placed), the due
date (the date the order was expected to be in stock), and the ship date (the date the order was
shipped to the customer).

o Both the DimCustomer and DimReseller tables are related to the DimGeography table, which
stores details of geographical locations. The DimSalesTerritory table is also related to
DimGeography.

o The DimProduct table contains details of products, and is related to the


DimProductSubcategory table, which is in turn related to the DimProductCategory table.

o Many tables include multiple language values for the same data value, for example, the
DimDate table stores the English, French, and Spanish words for each month name.

o The DimEmployee table is related to itself to represent the fact that each employee has a
manager, who is also an employee.

3. Save the diagram as AdventureWorksDW Schema.

Task 3: Query the Data Warehouse


1. Open the Query DW.sql script file in the D:\Labfiles\Lab01\Starter folder.

2. Execute the query under the comment Internet sales by year and month, and note the following:
o The data warehouse contains Internet sales orders from July 2005 to July 2008. Reseller sales in
the data warehouse are also recorded for this time period.

o You can use the MonthNumberOfYear column in the DimDate table to sort month names into
chronological order. Without this field, it would be difficult (though not impossible) for
reporting clients to sort months other than alphabetically. A similar field named
DayNumberOfWeek can be used to sort weekday names into chronological order.
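The sort pattern described above can be sketched in Transact-SQL as follows (assuming the AdventureWorksDW column names EnglishMonthName and CalendarYear, which follow the conventions noted in this exercise):

```sql
-- List the months of 2007 in chronological rather than alphabetical order
-- by sorting on MonthNumberOfYear instead of the month name itself.
SELECT DISTINCT
    d.MonthNumberOfYear,
    d.EnglishMonthName
FROM dbo.DimDate AS d
WHERE d.CalendarYear = 2007
ORDER BY d.MonthNumberOfYear;
```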

3. Execute the query under the comment Geographical reseller sales, and note the following:

o In 2005, Adventure Works only sold to resellers in the United States and Canada.

o In 2006, this was expanded to include France and the United Kingdom.
o In 2007, resellers in Australia and Germany were added.

o By contrast, Adventure Works has sold directly to Internet customers in all of these regions since
2005.

4. Execute the query under the comment Sales by product category, and note the following:

o Adventure Works sells four categories of product: Accessories, Bikes, Clothing, and Components.

o Components are only sold to resellers, not to Internet customers.

o Accessories were not sold to Internet customers until 2007.

Results: At the end of this exercise, you will have explored the data warehouse for the BI solution, and
used Transact-SQL queries to explore the data it contains.

Exercise 2: Exploring the Analysis Services Data Model


Scenario
Having examined the data warehouse, you will now explore an Analysis Services Data Model that is based
on the data warehouse tables.

The main tasks for this exercise are as follows:

1. View an Analysis Services Database

2. Create a PivotTable in Excel


3. Filter the PivotTable

Task 1: View an Analysis Services Database


1. In SQL Server Management Studio, connect to the MIA-SQL instance of Analysis Services.

2. Explore the Adventure Works OLAP database in the MIA-SQL Analysis Services server. Note that it
contains a data source named Adventure Works Data Warehouse, which connects to the
AdventureWorksDW database you explored in the previous exercise.

3. Browse the cube named Internet Sales, to view the Sales Amount measure by Order Date Fiscal
Year. Note that in Adventure Works, fiscal years run from July to June, so fiscal year 2006 represents
the period from July 2005 to June 2006.

4. Close SQL Server Management Studio when you are finished.

Task 2: Create a PivotTable in Excel


1. Create a new blank Excel workbook.

2. Import data from the MIA-SQL SQL Server Analysis Services server:

o Select the Internet Sales cube in the Adventure Works OLAP database.

o Import the data into a PivotTable report.

3. In the PivotTable, display the Sales Amount measure. Change the settings of this value field to
format it using the Accounting format.
4. Add the Order Date.Calendar Date hierarchy to the rows of the PivotTable, and explore the data by
drilling down from years to semesters, semesters to quarters, quarters to months, and months to days.

5. Save the workbook as Sales.xlsx in the D:\Labfiles\Lab01\Starter folder.

Task 3: Filter the PivotTable


1. Add a slicer based on the English Product Category Name field in the Product Category hierarchy
to the PivotTable you created in the previous task.

o Note that the Components category is disabled because there are no sales of components to
Internet customers.

2. Use the filter to show only sales for the Accessories category.

o Note that there were no accessories sold in 2005 or 2006.


3. Clear the filter, and then save and close the workbook.

Results: At the end of this exercise, you will have used Excel to explore an analytical data model built on
the data warehouse.

Exercise 3: Exploring Reports


Scenario
Executives and managers in Adventure Works rely on reports to keep up to date with how the business is
performing. In this exercise you will view some prototype reports to help you better understand the
business data and reporting requirements.

The main tasks for this exercise are as follows:

1. View Reports

2. Export a Report

Task 1: View Reports


1. In Internet Explorer, browse to the Adventure Works Portal SharePoint Server site at
http://localhost/sites/adventureworks.

2. In the Reports document library, view the Sales Trends report.

o Note that this report shows historic Internet sales by product category.

o Change the Calendar Year parameter to view sales in 2007.

o Expand the categories and months to see sales details.

3. In the Reports document library, view the US Sales By State report. Note that this report shows
annual sales revenue in the United States on a map.
4. Return to the Reports document library when you have finished.

Task 2: Export a Report


1. In the Reports document library, view the Sales Report report.

2. Hide the Parameters pane and note that the report includes a chart that shows monthly sales for
each product category.

3. Use the Next Page button at the top of the report to view the subsequent pages. Note that details
for monthly sales are displayed on each page.
4. Export the report to Excel and save the resulting workbook in the D:\Labfiles\Lab01\Starter folder.
Then open the workbook and note the following:

o The workbook includes a summary worksheet containing a chart, and a worksheet for each
month.

o On the month worksheets, you can use native Excel functionality to expand and collapse the data
groupings.
5. In Internet Explorer, export the report to Word and save the resulting document in the
D:\Labfiles\Lab01\Starter folder. Then open the document and note the following:

o The first page of the report shows the chart, and the sales details for each month start on a
new page.

o This report has been designed to identify when it is being rendered in a non-interactive format
and automatically expand the data groupings.

Results: At the end of this exercise, you will have viewed data in reports and exported a report to Excel
and Word formats.

Module Review and Takeaways


In this module, you have learned about the common elements of a BI solution and how Microsoft
software products can help you implement them. You have also learned about the key roles and scoping
tasks in a BI project.

Review Question(s)
Question: In your experience, what are the factors that affect the success or failure of a BI
project?

Module 2
Creating Multidimensional Databases
Contents:
Module Overview 2-1

Lesson 1: Introduction to Multidimensional Analysis 2-3

Lesson 2: Creating Data Sources and Data Source Views 2-9

Lesson 3: Creating a Cube 2-13

Lesson 4: Overview of Cube Security 2-17

Lab: Creating a Multidimensional Database 2-23

Module Review and Takeaways 2-28

Module Overview
The fundamental purpose of using Microsoft SQL Server Analysis Services (SSAS) Online Analytical
Processing (OLAP) solutions is to build cubes that you can use to perform complex queries and return the
results in a reasonable time. Typically, you use the SQL Server Data Tools for BI add-in for Visual Studio to
develop cubes.
Multidimensional databases, which are also known as OLAP cubes, connect to relational databases by
using connection information in data sources and data source views. With the addition of dimensions and
measures, they make it possible to navigate aggregated data.
For example, a simple cube could have a Sales Revenue measure and Time, Product, and Customer
dimensions. You can navigate the dimensions to find the total sales revenue for the previous month, a
specific customer, and a particular product.

This organization of data has many benefits, including:

Measures are pre-aggregated, enabling extremely fast queries compared to relational databases.

Multidimensional structures are logical and straightforward to navigate for information workers who
use tools such as Microsoft Excel.

It is possible to create perspectives of the cube to focus on key areas of data and make the pertinent
facts even more accessible.

This module provides an introduction to multidimensional databases and describes the core components of
an OLAP cube.

Objectives
After completing this module, you will be able to:

Describe the considerations for a multidimensional database.

Create data sources and data source views.


Create a cube.

Implement security in a multidimensional database.



Lesson 1
Introduction to Multidimensional Analysis
Multidimensional analysis using OLAP requires the creation of OLAP cubes alongside data sources and
data source views. This lesson describes data sources, data source views, and the core components of a
cube: dimensions, measures, and measure groups.

Lesson Objectives
After completing this lesson, you will be able to:

Identify the business problems that multidimensional analysis addresses.

Describe the core concepts of multidimensional analysis.


Define a data source.

Define a data source view.

Describe a dimension.

Define measures and measure groups.

Create an OLAP cube.

Describe the advantages of using SSAS as an OLAP platform.


Identify the software requirements for multidimensional analysis.

The Business Problem


Most database systems are designed for Online
Transaction Processing (OLTP). Many small
updates and inserts are happening in real time
and data is stored at the most granular level
possible. This level of detail is required and is
useful for many operations, but there are several
instances when these systems are not the best choice.

For example, Adventure Works sells bicycles and associated accessories. It is vital that every item
from every sale is recorded and these updates to
the database must happen almost instantaneously.
The problem occurs when the trading manager
requires a broad overview of sales for the past month by product category. The query to generate this
data involves filtering all sales to find only those that occurred in the past month, then grouping them by
product category before calculating the total sales for each category. Not only is this query very slow, but
it also causes locking and contention on the database, which slows down its primary role as a
transactional system.

OLTP systems have several limitations for use as a source of data for reporting and analysis:

Aggregated spanning queries are slow to generate.

Large queries can cause performance issues for transactional workloads.

Normalized relational data models are not intuitive for business users.
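The trading manager's report described above might translate into a query like the following sketch, written against an AdventureWorks-style normalized OLTP schema (the table and column names are illustrative):

```sql
-- Monthly sales by product category against a normalized OLTP schema.
-- The joins and the scan of a month of order rows run against the same
-- tables that the transactional workload is inserting into.
SELECT pc.Name AS ProductCategory,
       SUM(sod.LineTotal) AS TotalSales
FROM Sales.SalesOrderDetail AS sod
JOIN Sales.SalesOrderHeader AS soh
    ON soh.SalesOrderID = sod.SalesOrderID
JOIN Production.Product AS p
    ON p.ProductID = sod.ProductID
JOIN Production.ProductSubcategory AS ps
    ON ps.ProductSubcategoryID = p.ProductSubcategoryID
JOIN Production.ProductCategory AS pc
    ON pc.ProductCategoryID = ps.ProductCategoryID
WHERE soh.OrderDate >= DATEADD(MONTH, -1, GETDATE())
GROUP BY pc.Name;
```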

The Concepts of Multidimensional Analysis


Multidimensional analysis stores the data from a
data warehouse in a multidimensional cube. The
cube consists of measures - the values that are
aggregated - and dimensions, which are the axes
of the cube and give the measures their context.

For example, a sales cube might have Sales Revenue and Units Sold measures, as well as
Time, Product, and Customer dimensions.

The dimensions enable you to drill down through different layers of data by using a hierarchy. For
example, you can initially view sales by year, then
drill down to sales by month, and then by day.
You can also filter by using slicing and dicing dimension values:

Slicing. Involves using a member of one dimension to provide a slice of the cube. For example, you
could slice a cube using a particular product and view all sales of that product across all dates and
customers.

Dicing. Involves providing values for every dimension to locate a single value for a cube. For example,
you could look for sales of a particular product on a particular day to a particular customer.
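In relational terms, slicing and dicing correspond to constraining the dimension keys of a fact table: a slice fixes one dimension, a dice fixes them all. The following Transact-SQL sketch uses the AdventureWorksDW-style names from this course; the key values are purely illustrative:

```sql
-- Slicing: fix one dimension member, leave the others free.
SELECT SUM(f.SalesAmount) AS SalesForOneProduct
FROM dbo.FactInternetSales AS f
WHERE f.ProductKey = 310;            -- one product, all dates and customers

-- Dicing: fix a member of every dimension to reach a single cell.
SELECT SUM(f.SalesAmount) AS SingleCellValue
FROM dbo.FactInternetSales AS f
WHERE f.ProductKey   = 310           -- one product
  AND f.OrderDateKey = 20070315     -- one date
  AND f.CustomerKey  = 11000;       -- one customer
```

In an OLAP cube, the equivalent operations are expressed through the client tool or in MDX, and the aggregated values are read from the cube rather than computed at query time.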
One key difference from a relational system, or the newer tabular data model, is processing. The
aggregations of measures are calculated when the cube is processed. This enables you to perform the
complex processing when the system has spare capacity, perhaps at night. When business users run
their queries, results are returned extremely quickly because the complex processing has already been done.

SQL Server Analysis Services provides a platform for multidimensional databases containing OLAP cubes.

Data Sources
A data source is an object that provides a SQL
Server Analysis Services database with the
information needed to connect to a data source
for the cubes it contains. Analysis Services can
access data from one or more data sources.
Typically, the source database for an OLAP cube is
a data warehouse or a departmental data mart, in
which the tables have been denormalized to
reflect a star or snowflake schema of fact and
dimension tables. However, SQL Server Analysis
Services can use virtually any database as a data
source and abstract the underlying data model
with a data source view.

At a minimum, a data source includes an identifier, a name, and a connection string. The connection
string used to access the source data specifies the following information:

The provider name.

The information needed to connect to the data source by using the specified provider. The property
settings for particular data source objects vary according to the provider.

Other properties that the provider supports or requires.
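For example, a connection string for the default SQL Server Native Client OLE DB provider might look like the following (the server and database names are placeholders):

```
Provider=SQLNCLI11;Data Source=MIA-SQL;Initial Catalog=AdventureWorksDW;Integrated Security=SSPI;
```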

Data Source Views


A data source view is used to define specific tables
that are retrieved from the data source. It can also
change the presentation of those tables and their
columns to make them more user-friendly to Analysis
Services users, for example by changing column
names.
A data source view contains the following items:

A name and a description.

A definition of any subset of the schema retrieved from one or more data sources, up
to and including the whole schema, including
the following:
o Table names

o Column names

o Data types

o Nullability

o Column lengths

o Primary keys
o Primary key/foreign key relationships

Enhancements to the schema from the underlying data sources, including the following:

o Friendly names for tables, views, and columns.

o Named queries that return columns from one or more data sources (that show as tables in the
schema).

o Named calculations that return columns from a data source (that show as columns in tables or
views).

o Logical primary keys (needed if a primary key is not defined in the underlying table or is not
included in the view or named query).

o Logical primary key/foreign key relationships between tables, views, and named queries.

You can create multiple diagrams showing different subsets of the data source view. Unlike different data
source views, different diagrams reference the same schema. Therefore, any changes you make in one
diagram apply to all the others in the data source view. Diagrams are useful when you create multiple
cubes within the same solution.

Dimensions
Dimensions provide the context for facts stored in
the OLAP cube. If you visualize an OLAP cube as a
physical cube, the dimensions are the axes. The
dimension has a key attribute that enables SSAS to
join records to related data in the fact table.

Dimensions are hierarchical, not flat. Therefore, a time dimension typically has years, months, and
days while a geography dimension typically has
country, state, and town. A dimension can have
more than one hierarchy, enabling you to drill
down, for example, from year to month to day.
Another hierarchy will take you from year to week
to day.

Dimensions can also be unbalanced and ragged. An unbalanced dimension does not have the same
number of levels in every branch. For example, the Sales Director reports to the Chief Executive Officer
(CEO) and has three levels of staff below her. The Executive Assistant also reports to the CEO but has no
levels below him. A ragged dimension is a particular type of unbalanced dimension where there are
missing or duplicated levels in the hierarchy. For example, in a hierarchy that contains Continent, Country
or Region, State, and City, the city-state of Monaco would either have the values of Europe, Monaco,
Monaco, Monaco, or the values of Europe, Monaco, NULL, NULL.

Measures and Measure Groups


There is a particular dimension called Measures.
Measures are the facts aggregated at each
intersection of dimension members. They are
typically numeric and additive, and are
aggregated by using a sum calculation, although
there are measures that do not meet these criteria.

Measures are also part of a dimension because there are often multiple measures in a cube. For
example, you could select the state of California
from the Geography dimension, the year 2010
from the Time dimension, and the Sales Units
measure from the Measures dimension. You could
then modify this by changing to the Sales Amount measure from the Measures dimension.

You can use a measure expression to define the value of a measure. This enables you to apply calculations
to measure values or use multiple columns from a fact table in a measure expression.

You can also use attribute columns from dimension tables to define a measure. These attributes would not
typically contain facts or be additive, but could provide useful information. For example, you could use
them to provide a count of members.

Measures are grouped based on their underlying fact tables. Measure groups are used to define the
relationship between dimensions and measures. When a dimension or a measure group is added to the
cube, SSAS will examine the relationship that exists between the fact table and the dimension table in the
data source and automatically set the Measure Group information in the Dimension Usage tab of the
Cube Designer. You can then modify this to your requirements.

Cubes
A cube is the unit that defines dimensions and
measures, and is the users' point of access to the
multidimensional data.

The term cube is used because it describes the multidimensional nature of the data, as opposed
to the one-dimensional or two-dimensional types
of relational tables. However, almost all cubes
have more than three dimensions.

Cubes bring together all the functionality of dimensions and measures and enable you to
display this in a user-friendly interface such as an
Excel PivotTable table. The PivotTable interface
with SSAS enables you to display aggregated measure data and slice and dice it by using dimension
members.

Cubes contain all the settings that define storage, aggregation levels, and processing frequency.
By providing a multidimensional structure and the ability to employ a user-friendly interface such as Excel,
cubes are highly intuitive for business users to navigate.

SQL Server Analysis Services as an OLAP Platform


SSAS provides a service in which you can host
multidimensional databases. These databases
contain the data sources, data source views,
dimensions, measures, and cubes that business
users need to perform multidimensional analysis.
SSAS is a scalable enterprise OLAP solution. It
optimizes performance by using innovative
techniques such as block computation and is
highly scalable due to tools like the Aggregation
Design Wizard.

SSAS provides you with a wide range of tools for your OLAP data, including Key Performance
Indicators (KPIs), multilingual translation capabilities, and data mining analysis of data. This functionality
will be covered in more detail in subsequent modules.

Microsoft provides intuitive front-end tools through Microsoft Excel and Microsoft SharePoint
Server, enabling business users to quickly access the data they need with little or no knowledge of the
underlying system.

Software Requirements for Multidimensional Analysis


Multidimensional analysis is supported in SQL
Server Standard, Business Intelligence, Enterprise,
Developer, or Evaluation edition. To install SSAS
for multidimensional analysis, your server must
meet the same software prerequisites as the SQL
Server database engine:

Windows Server 2008 Service Pack 2 (SP2) or R2 (64-bit) or later

SQL Server 2014 Standard, Business Intelligence, Enterprise, Developer, or Evaluation Edition

Lesson 2
Creating Data Sources and Data Source Views
OLAP and data mining projects in SSAS are designed based on a logical data model of related tables,
views, and queries from one or more data sources. This logical data model is called a data source view.
Data source views enable you to define a subset of the data that populates a large data warehouse.

Lesson Objectives
This lesson describes how to create and modify data sources and data source views.

After completing this lesson, you will be able to:

Create a data source.

Create a data source view.

Modify a data source view.

Creating a Data Source


You can use the Data Source Wizard in Visual
Studio SQL Server Data Tools for BI to define one
or more data sources for an SSAS project. Data
sources use a Connection object that defines the
connection string. In the first step of the Data
Source Wizard, you must define the Connection
object or use an existing Connection object.

The default provider for a new connection is the Native OLE DB\SQL Server Native Client provider.
Other supported providers include:

SQL Server 7.0 using the SQL OLE DB provider or the .NET native OLE DB provider.

SQL Server 2000 using the SQL OLE DB provider or the .NET native OLE DB provider.

SQL Server using the SQL OLE DB provider or the .NET native OLE DB provider.
Oracle using the Oracle OLE DB provider or the OracleClient.NET native OLE DB provider.

Microsoft Access with the Microsoft Office 12.0 Access Database Engine OLE DB provider.

Teradata v2R6 with the OLE DB 1.3 provider from NCR (x86 only).

If you have an existing data source defined in an SSAS database or project and want to create a new data
source object that connects to the same underlying data source, you can simply copy properties of the
first data source object into a new version.

Creating a Data Source View


SSAS design tools use data source views to
maintain a cache of relational metadata and take
advantage of some of the annotations within a
data source view. Advantages of this approach
include:

A data source view makes available only tables that OLAP and data mining objects
require by describing a subset of tables and
views in a data source.
A data source view handles the layout of
tables, filters, Structured Query Language
(SQL) expressions, relationships, and other
complexities of the schema. Therefore, a data source view enables simple bindings by SSAS cubes,
dimensions, and mining models to the tables and columns in the data source view.

You can define a data source view by using the Data Source View Wizard in SQL Server Data Tools.
One of the benefits of data source views is that they can bring together data from multiple data
sources, by specifying secondary data sources in the Data Source View Designer. You can also use the
Data Source View Designer to browse source data.

Modifying a Data Source View


After you have created a data source view, you can
modify it to make the data more user-friendly.
Often, underlying data sources use naming
conventions to help developers understand the
schema. Also, the underlying data does not
typically store additional formatting information.
Using the Data Source View Designer, you can
make modifications to a data source view without
modifying the underlying source data or database
schemas. You can then present user-friendly
names and formatting without making any
changes to the source database systems. For
example, you can:
Use the FriendlyName property to specify a column name from a table or view that is easier for users
to understand or more relevant to the subject area.

Create a named query and named calculations that enable you to extend the relational schema of
existing tables in a data source view without modifying the underlying data source.

Create logical primary keys and relationships for improved performance.

Create diagrams to reduce the visual clutter when you only want to view a subset of the tables in the
data source view.
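As an illustration, a named query is simply a SELECT statement evaluated in the underlying source; the following sketch presents friendly column names without altering the source database (table and column names follow the AdventureWorksDW conventions used in this course):

```sql
-- Named query presenting user-friendly column names for a product table.
SELECT ProductKey,
       EnglishProductName AS [Product Name],
       Color,
       ListPrice          AS [List Price]
FROM dbo.DimProduct;
```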

Demonstration: Creating a Data Source and a Data Source View


In this demonstration, you will see how to:

Create a new Analysis Services project.

Create a Data Source.


Create a Data Source View.

Modify a Data Source View.

Demonstration Steps
Create an Analysis Services Project

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Demofiles\Mod02 folder, run Setup.cmd as Administrator.

3. Start Visual Studio, and on the File menu, point to New, and click Project.

4. In the New Project dialog box, in the Business Intelligence templates section, create a new
Analysis Services Multidimensional and Data Mining Project named Demo in the
D:\Demofiles\Mod02 folder.

Create a Data Source

1. In Solution Explorer, right-click Data Sources, and then click New Data Source.

2. On the Welcome to the Data Source Wizard page, click Next.

3. On the Select how to define the connection page, click New.

4. In the Server name field, type localhost.

5. In the Select or enter a database name field, click AdventureWorksDW, and then click OK.

6. Click Next.
7. Click Use a Specific Windows user name and password, and enter the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd. Then click Next.

8. In the Data source name field, type Adventure Works Demo DS, and then click Finish.

Create a Data Source View

1. In Solution Explorer, right-click Data Source Views, and then click New Data Source View.

2. On the Welcome to the Data Source View Wizard page, click Next.
3. Ensure that Adventure Works Demo DS is selected, and then click Next.

4. Select the DimDate, DimProduct, DimProductCategory, DimProductSubcategory, DimReseller,


DimSalesTerritory, and FactResellerSales tables, and then add them to Included objects. Then
click Next.

5. In the Name field, type Adventure Works Demo DSV, and then click Finish.

Modify a Data Source View

1. If it is not already open, in Solution Explorer, right-click Adventure Works Demo DSV, and then click
Open.

2. In the Data Source View Designer, click the table name of the FactResellerSales table and press F4.

3. In the Properties pane, change the FriendlyName property to Reseller Sales.



4. Repeat the previous steps to rename the DimSalesTerritory table to Sales Territory.

5. Right-click the DimProduct table, and then click New Named Calculation.

6. In the Column name field, type Product Size.

7. In the Expression field, type [Size] + ' ' + [SizeUnitMeasureCode], and then click OK.

8. Right-click the DimProduct table and click Explore Data. Then scroll down until you see a value in
the Product Size column (the last column in the table) that shows the size and measurement unit.

9. Close the Explore DimProduct Table window. Then on the File menu, click Save All.

10. Keep Visual Studio open. You will return to it in a later demonstration.

Lesson 3
Creating a Cube
A cube is a multidimensional structure containing dimensions and measures. Dimensions define the
structure of the cube and measures provide the numerical values of interest to the end user. As a logical
structure, a cube enables a client application to retrieve values as if cells in the cube defined every
possible summarized value.

Lesson Objectives
This lesson describes how to create and modify cubes, including some considerations for cube design.
After completing this lesson, you will be able to:

Create a multidimensional cube.

Describe the considerations for time dimensions.

Modify a cube by using the Cube Designer.

Browse a multidimensional cube.

Options for Creating a Cube


Use the Cube Wizard to create a cube quickly and
easily:

The Cube Wizard shows you how to specify the data source view and measures in the
cube.

When you create the cube, you can add existing dimensions or create new ones to structure the cube.

You can also create dimensions separately, using the Dimension Wizard, and then add them to a cube.

You can create a cube by choosing the relevant data source and data source view. Alternatively, you
can build the cube without using a data source and subsequently generate the database schema.
Another option is to create the cube and enable SSAS to create the underlying schema supporting
the cube.
You can use the Cube Wizard to automatically build attributes and hierarchies, or you can choose to
define them in the Cube Designer later.

Considerations for Time Dimensions


Most cubes include a time dimension, although
some solutions do not use time, opting instead for
another factor, such as a unit of work or project, to
delineate data.

Time sometimes requires multiple hierarchies. For example, you could have a hierarchy that has
Years, Weeks, and Days and another with Years,
Months, and Days, but not one with Years,
Months, Weeks, and Days because Weeks do not
fit into Months. Therefore, you need two
hierarchies to represent this. You can create
multiple time dimensions or just one with multiple
hierarchies. In the previous example, you would typically create one time dimension with multiple
hierarchies but, if you require a time dimension for a different purpose, such as Fiscal Time, you could
create it as a separate dimension.

Time dimensions are distinct from other dimensions because SSAS contains inherent functionality to
group the members into levels. SSAS has built-in logic that puts correct days into correct months and
correct months into the correct quarter. This is the only dimension type that behaves like this because, for
example, SSAS has no way of detecting which product should be assigned to each product group.

You can either use a Server Time dimension or a dimension table containing time data. A Server Time
dimension avoids joins and can be faster to process. However, many time dimensions include additional
data, for example, corporate quarters that do not follow calendar quarters, or public holidays. A time
dimension table enables you to add detail to the time members, which allows you to store additional
properties for each date.

Server time dimensions contain hierarchies, levels, and attributes but these are stored on the server rather
than in a separate dimension table.

The Cube Designer


Although the Cube Wizard creates the cube, many of its properties will be set by using the Cube Designer. The Cube Designer in SQL Server Data Tools enables you to view and edit the cube's properties and its objects, and browse cube data:

• You can modify cube structure, dimension usage, and calculations.

• You can add KPIs and actions, change cube partitioning and storage, create aggregations, create perspectives on cube data, and add translations to localize cube data.

• You can also browse the cube, to see how modifications will affect it.

Note: Creating and modifying a cube is covered in more detail in Module 3, Working with
Cubes and Dimensions.

Browsing a Cube
SSAS enables you to browse the cube from within
SQL Server Data Tools. This means you can test the
cube without having to switch between a client
application and SQL Server Data Tools. You can
also change the security context and browse the
cube as another user. Before browsing cube data
in the Cube Browser tab, the cube must be
deployed.

• You can then click the Browser tab from within the Cube Designer to browse the cube and see its measures aggregated by dimensions.

• You can also click the Analyze in Excel icon on the toolbar in the Browser tab to create an Excel workbook that connects to the cube. This enables you to see how the cube behaves when used as a source for an Excel PivotTable.
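When you browse a cube, the designer issues MDX queries on your behalf. For example, viewing a single measure aggregated across all dimensions corresponds to a simple MDX query such as the following sketch, which assumes the SalesDemo cube and Sales Amount measure created in the demonstration below:

```mdx
-- Returns the total Sales Amount aggregated across all dimensions,
-- equivalent to dragging the measure into an empty browser grid.
SELECT
  [Measures].[Sales Amount] ON COLUMNS
FROM [SalesDemo]
```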

Demonstration: Creating and Browsing a Cube


In this demonstration, you will see how to:

• Create a Cube.

• Deploy and Browse a Cube.

• Customize Cube Measures and Dimensions.

Demonstration Steps
Create a Cube

1. Ensure that you have completed the previous demonstration in this module.

2. Maximize Visual Studio, which should be open with the Demo project loaded.

3. In Solution Explorer, right-click Cubes, and click New Cube.

4. On the Welcome to the Cube Wizard page, click Next.

5. On the Select Creation Method page, ensure that Use existing tables is selected, and click Next.

6. On the Select Measure Group Tables page, click Suggest. Then note that the wizard identifies
DimReseller and Reseller Sales as potential measure group tables. Clear DimReseller so that only
Reseller Sales is selected, and click Next.
7. On the Select Measures page, clear the Reseller Sales checkbox to clear all selections, and then
select the Total Product Cost and Sales Amount measures. Then click Next.

8. On the Select New Dimensions page, clear the Reseller Sales dimension, and click Next.

9. On the Completing the Wizard page, change the cube name to SalesDemo and click Finish. The
cube is created and opened in the cube designer.

Deploy and Browse a Cube

1. In Solution Explorer, right-click Demo and click Deploy. If prompted, enter the password Pa$$w0rd
for the ADVENTUREWORKS\ServiceAcct user. Then wait for the deployment process to finish.

2. In the cube designer, click the Browser tab.



3. In the Metadata pane, expand Measures and expand Reseller Sales. The wizard has created the
measures you selected.

4. Drag the Sales Amount measure to the Drag levels or measures here to add to the query area.
The total sales amount for all reseller sales is shown.

5. In the Metadata pane, note that the wizard has created dimensions for the DimReseller and
DimProduct tables. It has also determined that there are three relationships defined between the
Reseller Sales table and the DimDate table, and so has created a dimension for each related column
(Due Date, Order Date, and Ship Date).

6. Expand each dimension and note the wizard has only created attributes for key columns. You will
need to modify the dimensions to add meaningful attributes for analysis.

7. Drag the Product Key attribute from the Dim Product dimension and drop it to the left of the Sales
Amount value in the grid. The browser now shows the aggregated sales amount totals for each
product key.

Customize Measures and Dimensions

1. In the cube designer, click the Cube Structure tab. Then, in the Measures pane, expand Reseller
Sales.

2. Right-click the Total Product Cost measure and click Rename. Then change the name of the
measure to Cost.

3. In the Dimensions pane, right-click Dim Reseller, click Rename and change the name of the
dimension to Reseller. Then rename Dim Product to Product.
4. Right-click the Product dimension and click Edit Dimension. This opens the dimension designer for
the Dim Product dimension (the dimension is named Product in the SalesDemo cube, but still
named Dim Product in the database).

5. In the Data Source View pane, in the DimProduct table, drag the EnglishProductName field to the
Attributes pane.

6. In the Attributes pane, right-click English Product Name and click Rename. Then rename the
attribute to Product Name.

7. In Solution Explorer, right-click Demo and click Deploy. If prompted, enter the password Pa$$w0rd
for the ADVENTUREWORKS\ServiceAcct user. Then wait for the deployment process to finish.

8. Click the SalesDemo.cube [Design] tab to return to the cube designer, and then, in the cube
designer, click the Browser tab.

9. On the Browser tab, click the Reconnect button in the upper left. If the grid contains any data from
previous queries, right-click in the grid and click Clear Grid.

10. In the Metadata pane, expand Measures and Reseller Sales, and drag Cost to the Drag levels or
measures here to add to the query area.

11. Expand the Product dimension, and then drag Product Name to the left of the Cost value. The total
cost associated with sales for each product is shown.

12. On the File menu, click Save All.

13. Keep Visual Studio open. You will use it again in a later demonstration.

Lesson 4
Overview of Cube Security
After you install an instance of SSAS, only members of the server role have server-wide permissions to
perform any task within it. By default, no other users have access permissions to the objects in the
instance. Members of the SSAS server role can grant other users access to server and database objects by
using Microsoft SQL Server Management Studio, SQL Server Data Tools, or an XML for Analysis (XMLA)
script.

Lesson Objectives
After completing this lesson, you will be able to:

• Describe the SSAS security model.

• Describe the server role.

• Describe database roles.

• Explain cube permissions.

• Describe cell permissions.

• Describe dimension security.

• Test cube security.

Analysis Services Security Model


SQL Server Analysis Services relies on Windows to authenticate users. After a user has been authenticated, SSAS controls permissions within the databases based on the user's role membership. For example:

• Principals are entities that can request SQL Server resources.

• Securables are the resources to which the SQL Server Database Engine authorization system regulates access.

• SSAS has a single fixed server role for administrators, with members of this role having full permissions to perform any action in the instance.

• Administrators are able to create user database roles that can be granted user access and database administrator rights. Role permissions are database-specific.

• You can grant database role permissions for database and cube dimensions, individual dimension members, cubes, individual cells within a cube, mining structures, mining models, data sources, and stored procedures.

Adding Members to the Server Role


• The server role is a fixed role at the server level. You cannot change its permissions or add further server roles.

• The server role gives the administrator access to SSAS, and members of this role have access to all SSAS databases and objects on that instance of SSAS. To give lower-level administrative rights, you should consider adding users to a database role and granting the relevant permissions.

• Use SQL Server Management Studio to add members to the server role on the Security page of the Analysis Server Properties dialog box.

Creating Database Roles


To assign permissions to users or administrators who do not have server-wide administrative rights, you should create database roles.

You should first decide what levels of access are required within your organization, from basic users through power users, managers, and department administrators. Remove groups with duplicated permissions and, where possible, consolidate groups with similar permissions, as long as this does not give any sensitive permission to another group. After you have defined the groups, you can then create them and add users.

You can create database roles by using SQL Server Management Studio or the Visual Studio SQL Server
Data Tools add-in:

• In SQL Server Management Studio, in Object Explorer, expand the database, and then right-click the Roles folder to add a new role.

• In Visual Studio, the Roles folder is displayed in Solution Explorer. You can also add a role from the Project menu.

Granting Data Source and Cube Permissions


To access data within the database, users must be members of a role with permissions to access the cube. Permissions to access a data source are not required to access cube data:

• Users do not typically need access to data sources.

• You can grant roles read or read/write permission on cubes within the database. By default, a role with permissions for the cube has:

o Access to the individual cells within the cube.

o Read permissions on the dimensions and dimension members that the cube contains.

Granting Cell Permissions


A role that has read or read/write permissions is
granted permissions on individual cells until you
apply cell security. However, by applying cell
permissions, you can override the default
permissions that are inherited from cube
permissions. By default, cell permissions apply to
all cube cells but, if you grant access to one or
more cells, the role is denied access to all others
until you grant further permissions. Use a
Multidimensional Expressions (MDX) statement to
limit the range of cells to which the permissions
apply.

Note: MDX statements are discussed in more detail in Module 5, Introduction to MDX.

Read and read/write access are self-explanatory but there is also read-contingent access that gives you
permissions on a calculated cell only if you have permissions on all cells referenced in the calculation. For
example, the Sales Manager role has read access to the Sales Value measure but no access to the
Product Cost measure. The Sales Manager role has also been granted read access to the Profit calculated
measure. The Profit measure is Sales Value minus Product Cost. With read access, Sales Manager role
members can see the results of the Profit measure and, because they also have access to the Sales Value
measure, can infer the Product Cost measure even though they have no direct access. If you apply read-
contingent access to the Profit measure, users can only see the results if they have permission to view the
Sales Value and Product Cost data.
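As an illustration, a read permission expression for the Sales Manager role described above might look like the following sketch. The expression is entered on the Cell Data tab of Role Designer and must evaluate to true for every cell the role is allowed to read; the measure names here are the hypothetical ones from this example, not objects in the lab database:

```mdx
-- Allow the role to read cells for the Sales Value and Profit measures only;
-- cells for all other measures (including Product Cost) are denied.
[Measures].CurrentMember IS [Measures].[Sales Value]
OR [Measures].CurrentMember IS [Measures].[Profit]
```

Entering a similar expression as the read-contingent permission instead would make access to the Profit cells depend on the role also having access to the cells from which Profit is calculated.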

Granting Dimension and Dimension Data Permissions


• By default, a role with access to a cube has read access to all of the cube dimensions. You can modify this and deny access to one or more members in the dimension.

• The default access is read-only, so users cannot process the cube. However, if you change the permissions of a relevant role to read/write, users will be able to process the cube.

• By default, permissions apply to all dimension members, but you can write an MDX statement to define a subset of the dimension.

Note: MDX statements are discussed in more detail in Module 5, Introduction to MDX.

You can use an MDX statement for an allowed member set or a denied member set. You can also define a
default member to control which member set users see when they first connect to the cube. You can also
define this by using MDX to create versatile default members. For example, you could use an MDX
statement to set the current calendar year as the default member of a time dimension.
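As a sketch, an allowed member set and a versatile default member might be defined with MDX expressions like the following. The dimension, hierarchy, and member names are illustrative assumptions, not objects from the lab database, and the default member expression assumes year members are keyed by the four-digit year:

```mdx
-- Hypothetical allowed member set: restrict the role to one country member.
{ [Customer].[Country].&[United Kingdom] }

-- Hypothetical default member: resolve to the member for the current
-- calendar year each time a user connects.
StrToMember("[Date].[Calendar Year].&[" + Format(Now(), "yyyy") + "]")
```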

Testing Security Permissions


After you have configured security for an SSAS
database, it is important to test security settings
and permissions. You can test permissions, without
logging off from your Windows session, through
the Visual Studio SQL Server Data Tools for BI add-
in. This enables you to test and modify security:

• In the Cube Designer, on the Browser tab, you can change a user to an individual or to a role to test security by clicking the Change User button.

• In Role Designer, on the Cell Data tab, you can click the Test cube security hyperlink. This has the advantage that it saves objects if necessary and defaults to the current role.

You may also need to test security from SQL Server Management Studio, or you may want to test a client
application such as Excel. You can use the Windows Run as feature to start an application such as SQL
Server Management Studio, and test administrative permissions like those granted through membership
in the server role or the database-level Full Control (Administrator) permission.

Demonstration: Applying Cube Security


In this demonstration, you will see how to:

• Create a Role.

• Test Permissions.

Demonstration Steps
Create a Role

1. Ensure you have completed the previous demonstration in this module. Visual Studio should be open
with the Demo project loaded.

2. In Solution Explorer, right-click Roles, and click New Role. Then in Solution Explorer, right-click the
Role.role icon that has been added, click Rename, and change the name to Restricted User.role.
When prompted to change the object name as well, click Yes.

3. In the role designer, on the General tab, note that by default this role has no database permissions.

4. On the Membership tab, note that this is where you can add Windows users and groups to the
role.

5. On the Data Sources tab, note that by default the role has no access to any data sources.

6. On the Cubes tab, change the Access value for SalesDemo to Read. This allows the role to access
the cube and read its data.

7. On the Cell Data tab, note that this is where you can specify MDX statements that define cells in the
cube which you want to permit or deny this role to access.

8. On the Dimensions tab, note that the role has Read access to all dimensions in the database. Then in
the Select Dimension Set list at the top of the page, click SalesDemo cube dimensions and note
that these permissions are inherited by the dimensions in the SalesDemo cube.

9. On the Dimension Data tab, in the Dimension list, under SalesDemo, select Measures Dimension
and click OK. Then on the Basic tab, clear the checkbox for the Cost measure.

10. In the Dimension list, under SalesDemo, select Product and click OK. Then on the Basic tab,
in the Attribute hierarchy list, select Product Name.

11. Select Deselect all members. Then scroll to the bottom of the list of product names and select only
the products that begin with the word Touring.

12. On the File menu, click Save All.

Test Permissions

1. In Solution Explorer, right-click Demo and click Deploy. If prompted, enter the password Pa$$w0rd
for the ADVENTUREWORKS\ServiceAcct user.

2. When deployment is complete, in Solution Explorer, double-click SalesDemo.cube to return to the cube designer.
3. On the Browser tab, click the Reconnect button at the top left. If the grid contains any data from
previous queries, right-click in the grid and click Clear Grid.

4. In the Metadata pane, expand Measures and Reseller Sales, and drag Sales Amount and Cost to
the Drag levels or measures here to add to the query area. Then expand the Product dimension
and drag Product Name to the left of the measure values. The browser shows cost and sales amount
for all products.

5. Right-click in the grid and click Clear Grid.



6. In the top left area of the browser window, click the Change User button. Then, in the Security
Context - SalesDemo dialog box, select Roles, and in the drop-down list, select Restricted User
and click OK. Click OK again to return to the browser window.

7. Note that the grid has been cleared. Then, in the Metadata pane, expand Measures and Reseller
Sales, and drag Sales Amount to the Drag levels or measures here to add to the query area. Note
that members of this role cannot access the Cost measure.

8. Expand the Product dimension and drag Product Name to the left of the measure values. The
browser only shows sales amount for the Touring products.
9. On the File menu, click Save All, and then close Visual Studio.

Lab: Creating a Multidimensional Database


Scenario
Business analysts at Adventure Works Cycles have requested an analytical solution that enables them to
slice and dice business data to analyze Internet sales. To accomplish this, you will create a
multidimensional database that includes an OLAP cube.

Objectives
After completing this lab, you will be able to:

Create a data source.

Create and modify a data source view.

Create and modify a cube.

Add a dimension to a cube.

Estimated Time: 30 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Creating a Data Source


Scenario
Information workers want to analyze Internet sales data in Excel. To support this, you plan to implement a
multidimensional database. You first need to create a connection and data source to the data warehouse.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment


2. Create an Analysis Services Project

3. Create a Data Source

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab02\Starter folder as Administrator.

Task 2: Create an Analysis Services Project


1. Use Visual Studio to create a new Analysis Services Multidimensional and Data Mining project.

2. Name the project Adventure Works OLAP, and then save it in the D:\Labfiles\Lab02\Starter folder.

Task 3: Create a Data Source


1. Use the Data Source Wizard in Visual Studio to create a data source for the AdventureWorksDW
database on the local server.

2. Use Windows authentication to connect to the data source.

3. Configure the data source to use the ADVENTUREWORKS\ServiceAcct account with the password
Pa$$w0rd.

4. Name the data source Adventure Works Data Warehouse.



Results: After this exercise, you should see the Adventure Works Data Warehouse.ds data source in the
Data Sources folder.

Exercise 2: Creating and Modifying a Data Source View


Scenario
The AdventureWorksDW database includes a large amount of data not required by your users, and the
tables use naming conventions that are confusing to business analysts. You need to define which tables
you require and then change their names to be more user-friendly. Users have also requested the ability
to retrieve full names of customers in addition to their first, middle, and last names.

The main tasks for this exercise are as follows:

1. Create a Data Source View

2. Modify a Data Source View

Task 1: Create a Data Source View


1. Use the Data Source View Wizard in Visual Studio to create a data source view for the Adventure
Works Data Warehouse data source you created in the previous task.

2. Include the following tables:

o DimCustomer

o DimDate

o DimGeography

o DimProduct

o DimProductCategory

o DimProductSubCategory

o FactInternetSales

3. Name the data source view Adventure Works DSV.

Task 2: Modify a Data Source View


1. Change the FriendlyName property of the tables in the Adventure Works DSV data source view to
remove the Dim or Fact prefix and add spaces between words where appropriate.

o For example, the FactInternetSales table should have the friendly name Internet Sales, and the
DimProductCategory table should have the friendly name Product Category.

2. Add a calculation named Full Name to the Customer table. Use the following SQL expression:

CASE
WHEN MiddleName IS NULL THEN
FirstName + ' ' + LastName
ELSE
FirstName + ' ' + MiddleName + ' ' + LastName
END

Results: After this exercise, you should have created a data source view named Adventure Works DSV.dsv.

Exercise 3: Creating and Modifying a Cube


Scenario
Now you have created a data source view, you are ready to create a cube that enables business users to
perform multidimensional analysis on the data. You will then modify the cube to give the measures in the
cube user-friendly names to make analysis simpler. You will also add extra attributes to dimensions to
provide users with multiple ways to aggregate the measures.

You need to modify dimension attributes used for dates in your cube to ensure uniqueness across
temporal periods. For example, the month of January is not unique because it occurs in every year. You
must modify the month attribute so that its key is based on both month and year, and then specify which
of these key columns should be used when displaying the attribute name.
The main tasks for this exercise are as follows:

1. Create a Cube

2. Edit Measures

3. Edit Dimensions

4. Browse the Cube

Task 1: Create a Cube


1. Use the Cube Wizard in SQL Server Data Tools to create a new cube from existing tables.
2. Use the Internet Sales table as the measure group, and select the following measures:

o Order Quantity

o Total Product Cost


o Sales Amount

o Internet Sales Count

3. Create dimensions for all tables other than Internet Sales.

4. Name the cube Adventure Works Cube.

Task 2: Edit Measures


1. Change the name of the Order Quantity measure to Internet Order Quantity.

2. Change the name of the Total Product Cost measure to Internet Cost.

3. Change the name of the Sales Amount measure to Internet Revenue.

4. Save the cube.

Task 3: Edit Dimensions


1. Modify the Customer dimension in Dimension Designer.

o Add the City, StateProvinceName, EnglishCountryRegionName, and PostalCode attributes from the Geography table to the dimension.

o Add the CustomerAlternateKey, Title, FirstName, MiddleName, LastName, and Full Name attributes from the Customer table to the dimension.

2. Modify the Product dimension in Dimension Designer.

o Add the ProductAlternateKey, EnglishProductName, and ListPrice attributes from the Product table to the dimension.

o Add the EnglishProductSubcategoryName attribute from the ProductSubcategory table to the dimension.

o Add the EnglishProductCategoryName attribute from the ProductCategory table to the dimension.

3. Modify the Date dimension in Dimension Designer.

o Add the EnglishMonthName, MonthNumberOfYear, CalendarQuarter, CalendarYear, and CalendarSemester attributes from the Date table to the dimension.

o Rename the English Month Name attribute to Month.

Task 4: Browse the Cube


1. Deploy the Adventure Works OLAP project, entering the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd if prompted.

2. Use the cube browser in Visual Studio to view the Adventure Works cube, and add the Internet Revenue and
Internet Sales Count measures to the data area, aggregated by Order Date.Calendar Year.

Results: After this exercise, you should have successfully created and deployed a cube named Adventure Works Cube.

Exercise 4: Adding a Dimension


Scenario
After you have created the cube with the wizard, a business analyst has requested the ability to view
Internet sales by sales territory. To accommodate this, you must create a dimension in the database, and
add it to the cube.

The main tasks for this exercise are as follows:

1. Add a Table to a Data Source View

2. Create a Dimension

3. Add a Dimension to a Cube


4. Analyze a Cube in Excel

Task 1: Add a Table to a Data Source View


1. Edit the Adventure Works DSV data source view and add the DimSalesTerritory table.

2. Change the friendly name of the DimSalesTerritory table to Sales Territory.


3. Save the data source view.

Task 2: Create a Dimension


1. Add a dimension to the database based on the Sales Territory table you added in the previous task.

2. Use SalesTerritoryKey as the key column and the name column.


3. Add the SalesTerritoryRegion, SalesTerritoryCountry, and SalesTerritoryGroup attributes and
enable browsing for each of them.

Task 3: Add a Dimension to a Cube


1. Add the Sales Territory dimension to the Adventure Works cube.

2. Verify that the Sales Territory dimension is used by the Internet Sales measure group through a
relationship based on the Sales Territory Key attribute.

3. Save the cube.

Task 4: Analyze a Cube in Excel


1. Deploy the Adventure Works OLAP project, entering the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd if prompted.

2. Analyze the cube in Excel, and view the Internet Revenue measure by the Sales Territory Group,
Sales Territory Country, and Sales Territory Region attributes of the Sales Territory dimension.

3. Close Excel without saving the workbook.

4. Save the project and close Visual Studio.

Results: At the end of this exercise, your database and cube should contain a Sales Territory dimension.

Module Review and Takeaways


In this module, you have learned how to create multidimensional Analysis Services database projects. You
then learned how to define a data source, a data source view, how to create a cube, and how to add
dimensions to a cube after it has been created.

Review Question(s)
Question: What considerations should you take into account when assigning friendly names
for objects in a multidimensional database?

Module 3
Working with Cubes and Dimensions
Contents:
Module Overview

Lesson 1: Configuring Dimensions

Lesson 2: Defining Attribute Hierarchies

Lesson 3: Sorting and Grouping Attributes

Lab: Defining Dimensions

Module Review and Takeaways

Module Overview
In Microsoft SQL Server Analysis Services, dimensions are a fundamental component of cubes. The
facts, or measures, provide the values that interest you, but dimensions provide the business context by
which these measures are aggregated. Dimensions organize your data by category and show results
grouped by these categories, such as product, customer, or month.

Analysis Services dimensions contain attributes that correspond to columns in dimension tables. These
attributes appear as attribute hierarchies and can be organized into user-defined hierarchies, or defined
as parent-child hierarchies, based on columns in the underlying dimension table. Hierarchies are used to
organize measures contained in a cube.

Objectives
This module provides an insight into the creation and configuration of dimensions and dimension
hierarchies.

After completing this module, you will be able to:

• Configure dimensions.

• Define attribute hierarchies.

• Sort and group attributes.



Lesson 1
Configuring Dimensions
All Analysis Services dimensions are groups of attributes based on columns from tables or views in a data
source view. Dimensions can exist independently of a cube, be used in multiple cubes, be used multiple
times in a single cube, and be linked between Analysis Services instances. A dimension existing
independently of a cube is called a database dimension and an instance of a database dimension within a
cube is called a cube dimension.

Lesson Objectives
After completing this lesson, you will be able to:

• Describe the concepts of a dimension.

• Edit dimensions with the Dimension Designer.

• Describe the dimension storage options.

• Describe the use of dimension attributes.

• Define the columns used to connect dimensions to the fact table, to display to users, and to use in Multidimensional Expressions (MDX) calculations.

• Describe dimension types.

• Describe role-playing dimensions.

Dimension Concepts
Dimensions form the contexts for facts, and define
the aspects of the business by which facts are
aggregated. For example, Product and Time
dimensions might provide context to the Sales
Revenue measure.
Dimensions are collections of attributes from
tables or views. You can use dimensions in
multiple cubes and link to them from remote
instances.

Attributes add meaning to dimensions. Each column in the dimension table can provide an attribute that supplies a piece of information about the dimension member. For example, a Retail Store dimension might have Manager and Postal Code attributes.

The Key attribute is typically the primary key of the dimension table; it is the column that links the dimension table to the fact table. The Name attribute provides a friendly name for the dimension, enabling you to display to end users an alternative, more relevant name than the source table name.

Attributes are typically arranged into hierarchies that define the drill-down paths through aggregations.
Each layer of the hierarchy is called a level. For example, a Time dimension might have a hierarchy with
Day, Month, and Year levels, but other hierarchies could be defined with attributes such as Week,
Quarter, and Century.

By default, every attribute defined within a dimension is part of a two-level hierarchy known as an
attribute hierarchy. The top level is named All; the second is the leaf level, which contains a value for each
individual member of the attribute. Although attribute hierarchies can help in browsing, the data within an attribute
hierarchy can be too large and difficult to navigate. Attribute hierarchies are, therefore, often replaced
with user-defined hierarchies in which you can define more levels and make data more manageable. For
example, a Date attribute hierarchy might have an All level and a second level containing all the dates
within the dimension. To make this more manageable, you could replace the Date attribute hierarchy
with a user-defined hierarchy named Calendar Year that contains Year, Quarter, and Month levels,
making the data easier to navigate.
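The difference shows up when you query the cube. As a sketch, the following MDX drills into a user-defined hierarchy at a specific level; the cube, hierarchy, and measure names here are illustrative assumptions rather than objects from the lab database:

```mdx
-- List every month in the Calendar Year user-defined hierarchy with its
-- aggregated Sales Amount; referencing the [Year] level instead of [Month]
-- would move the query up the drill-down path.
SELECT
  [Measures].[Sales Amount] ON COLUMNS,
  [Date].[Calendar Year].[Month].Members ON ROWS
FROM [Sales]
```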

Analysis Services can add dimension intelligence to some dimensions. For example, time dimensions have
a fixed structure and this can be used by Analysis Services. It knows which months are in each quarter,
which days are in each month, and so on.

The Dimension Designer


Use the dimension designer to edit the attributes,
levels, hierarchies, and translations of a dimension,
and to browse the dimension. The dimension
designer includes four tabs:

• Use the Dimension Structure tab to view and edit the attributes, levels, and hierarchies of the dimension.

• Use the Attribute Relationships tab to create, modify, or delete the attribute relationships of the selected dimension.

• Use the Translations tab to view and edit the multi-language translations for the attributes, levels, and hierarchies.

• Use the Browser tab to browse members of each hierarchy in the dimension. You can only browse members after you have deployed the solution.

Configuring Dimension Storage


The two dimension storage modes in Analysis
Services are Multidimensional Online Analytical
Processing (MOLAP) and Relational Online
Analytical Processing (ROLAP). These storage
modes define where, and in what structure type,
the dimension data is stored:

Data for a dimension that uses MOLAP is stored in a multidimensional structure in the instance of
Analysis Services. This multidimensional structure is created and populated when the dimension is
processed. MOLAP dimensions provide better query performance than ROLAP dimensions.

Data for a dimension that uses ROLAP is stored in tables used to define the dimension. The ROLAP
storage mode can be used to support large dimensions without duplicating large amounts of data,
but at the expense of query performance. Because the dimension relies directly on the tables in the
data source view used to define the dimension, the ROLAP storage mode also supports real-time
ROLAP.
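The storage trade-off can be sketched as precomputing versus delegating reads. This is a loose analogy of the behavior described above, not the actual storage engine:

```python
# Source table for the dimension (invented rows).
source_rows = [("Bikes",), ("Clothing",), ("Accessories",)]

class MolapDimension:
    """MOLAP: members are copied into Analysis Services storage at processing time."""
    def __init__(self):
        self.members = None

    def process(self):
        self.members = list(source_rows)  # snapshot built during processing

    def read(self):
        return self.members               # fast: no trip to the source

class RolapDimension:
    """ROLAP: members stay in the source tables and are read at query time."""
    def read(self):
        return list(source_rows)          # always reflects the source (real-time)

molap = MolapDimension()
molap.process()
rolap = RolapDimension()

source_rows.append(("Components",))       # the source changes after processing
print(len(molap.read()), len(rolap.read()))
```

After the source changes, the MOLAP snapshot is stale until reprocessing, while the ROLAP read reflects the change immediately, which mirrors the real-time ROLAP behavior described above.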

Configuring Dimension Attributes


The Cube Wizard and Dimension Wizard create
attributes for a dimension. Other wizards and
elements of the Analysis Services user interface
may further modify these attributes. These default
settings are sufficient in most situations, but you
can use Dimension Designer to edit the attributes.

You can remove attributes from the dimension by right-clicking the attribute and clicking Delete.
Removing attributes from a dimension does not affect the data source view, so the same table can be
used by multiple dimensions without forcing each dimension to use all the same attributes.

You can rename an attribute to provide a more meaningful or user-friendly name than the dimension
table column name. Do this by right-clicking the attribute in the Dimension Structure tab of the
dimension designer. You can also set the Name property of the attribute in the Properties window, or
edit it directly if the Attributes pane is in the grid view.

Having many attributes for a dimension can confuse users. You can organize attributes into display
folders to simplify browsing. Using folders only affects the way client applications display the
dimension and has no other effects on hierarchies or attributes. After you have created display folders,
deploy the solution and reconnect to see the results on the Browser tab.

Attributes can be used to create hierarchies. For example, a Product dimension might have Product
Category, Product Sub-Category, and Product Name attributes. Hierarchies are covered in more detail in
the next lesson.

Attributes can also be used to add detail to dimension members and do not need to be the basis for a
hierarchy. Therefore, in the previous example, you might have Product Description and Product Image
attributes. These attributes are not used for a hierarchy, nor to sort or group members, but are often
needed to provide additional detail to dimension members without having to drill down to relational
data.

Attribute Column Bindings


To control output from attributes, you can define
the column that uniquely identifies attribute
values, the column that users see, and an optional
value column you can use for MDX calculations.

The KeyColumn is the column, or columns, that uniquely identifies each member of the attribute. This
is often the primary key of the dimension table, and is used when you choose to order the hierarchy
by key. In some cases, a member must be identified by a combination of column values. For example,
in a geography dimension, the City value Paris might not be unique, as there might be a row for Paris,
France and another row for Paris, Texas in the United States. You can either use the primary key
column of the dimension table to identify the correct Paris, or a multi-column key that includes the
city, state, and country or region to uniquely identify members in your dimension. This is particularly
important if you plan to include an attribute in a hierarchy.

The NameColumn provides the value that a user will see and can give more useful information than
the key column. You can define a name column as a calculated column you have created in a data
source view. For example, an Employee dimension might have an EmployeeID key column, but you
might want the name of the employee to be displayed. To do this, you can create a calculated
column based on first and last name in the data source view, and then use this as the name column.

The ValueColumn is returned by the MDX MemberValue function. This allows you to create
calculations based on a value other than the name or key. For example, a time dimension is stored in
a dimension table with one row for each day. The key column is a SMALLINT and the name column is
a CHAR(11) that displays the day, month, and year. The value column contains the date as a
SMALLDATETIME. This allows calculations to use the true date value rather than converting it from
string values, but also enables the key column to hold a smaller value. A key column value occurs in
every row of the fact table, so reducing its size makes the fact table smaller and makes joins between
the fact table and the time dimension faster.
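The three bindings can be illustrated in one sketch. All member values here are invented, and in Analysis Services the value column is surfaced through the MDX MemberValue function rather than Python code:

```python
from datetime import date

# KeyColumn: a composite key disambiguates members such as the two cities
# named Paris; a single-column key would collapse them into one member.
cities = [("Paris", "France"), ("Paris", "United States"), ("London", "United Kingdom")]
by_name_only = {city for city, _ in cities}
by_composite_key = set(cities)

# NameColumn: a calculated column builds a friendly display value,
# shown to users instead of the surrogate key.
def display_name(first, last):
    return f"{first} {last}"

# ValueColumn: a typed date supports calculations directly, while the
# fact table stores only the small integer key.
member = {"key": 732, "name": "15 Jan 2014", "value": date(2014, 1, 15)}
days_into_year = (member["value"] - date(member["value"].year, 1, 1)).days

print(len(by_name_only), len(by_composite_key), days_into_year)
```

The composite key preserves three distinct members where the name alone yields two, and the typed value supports date arithmetic without parsing the CHAR(11) display string.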

Dimension Types
Initially all dimensions have a type of Regular, but
you can also define the type as Time,
Organization, Geography, BillOfMaterials,
Accounts, Customers, Products, Scenario,
Quantitative, Utility, Rates, Channel, or
Promotion.

In most cases the dimension type has no effect in Analysis Services and is merely a value that is passed
to client applications, for example to alter formatting. However, there are three dimension types
(Time, Currency, and Accounts) that have a functional effect in Analysis Services.

Time Dimensions
If your time data is stored as a regular dimension (not as a server time dimension), and it has a Time
dimension type, you can add time intelligence. The Business Intelligence Wizard will create a hierarchy for
the dimension and enable you to use time-dependent MDX functions such as PeriodsToDate.
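Conceptually, a function such as PeriodsToDate accumulates a measure over all periods from the start of an ancestor period up to the current member. A rough Python sketch of a year-to-date total, with invented monthly values:

```python
# Monthly sales for one year, in calendar order (invented values).
monthly_sales = {"Jan": 100, "Feb": 120, "Mar": 90, "Apr": 110}
month_order = ["Jan", "Feb", "Mar", "Apr"]

def year_to_date(up_to_month):
    """Sum all months from the start of the year through up_to_month."""
    idx = month_order.index(up_to_month)
    return sum(monthly_sales[m] for m in month_order[: idx + 1])

print(year_to_date("Mar"))
```

Time intelligence lets Analysis Services perform this kind of accumulation for you, because the Time dimension type tells it which members belong to which year, quarter, and month.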

Currency Dimensions
Currency dimensions store values in one base currency and then convert to a local currency based on the
locale of the client. You must have a measure group that contains the exchange rate.

To create a currency dimension, change the dimension's Type property to Currency. You can then run
the Business Intelligence Wizard, which will design the currency conversion process.

Accounts Dimensions
Financial accounts are not standard dimensions. The aggregation of accounts is non-standard and,
without a mechanism to define specific account types, you would have to create custom rollup
formulas and calculated members. Accounts dimensions are parent-child dimensions in which
accounts are children of other accounts.
To create a financial account dimension, set the dimension type to Accounts, and then run the Business
Intelligence Wizard to add account intelligence. The wizard will then guide you through the process of
defining individual accounts as an account type such as income or expense.

Note: For more information about how to Add Account Intelligence to a Dimension, go to
http://go.microsoft.com/fwlink/?LinkID=246781. For more information about Dimension Types,
see http://go.microsoft.com/fwlink/?LinkID=246782.

Role-Playing Dimensions
Some dimensions can be reused for multiple
relationships with the same measure group or fact
table. For example, an Order table might be
related to a dimension on both OrderDate and
ShipDate, so users can choose to aggregate
orders by the date on which they were placed, the
date on which they were shipped, or both.
Dimensions that can be used for multiple
relationships are known as role-playing
dimensions.

Another example, from the insurance industry, could show a fact table with policy values that contains
BuyerID and SellerID columns. Both IDs can relate to an insurance broker dimension table on an
InsuranceBrokerID column. In this case, an insurance broker may be either the buyer or the seller of
an insurance policy.
In multidimensional data modeling, SQL Server Analysis Services creates a role-playing dimension for each
relationship defined in the logical model. In the date example described earlier, two dimensions would be
created:

Date (Order Date)

Date (Ship Date)



An important point to note is that, although the cube has two dimensions, in reality, only one physical
dimension exists. Any changes you make to the underlying dimension (such as defining hierarchies) will
be applied to all role-playing dimensions that are based on it.
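The effect of one physical Date dimension playing two roles can be sketched as follows; the fact rows are invented for the example:

```python
from collections import defaultdict
from datetime import date

# Fact rows: (order_date, ship_date, amount). Both date columns join to the
# same physical Date dimension, playing two different roles.
orders = [
    (date(2014, 1, 10), date(2014, 1, 12), 50),
    (date(2014, 1, 10), date(2014, 1, 15), 30),
    (date(2014, 1, 12), date(2014, 1, 15), 20),
]

def aggregate_by(role):
    """role selects which relationship to the Date dimension is used."""
    col = {"Order Date": 0, "Ship Date": 1}[role]
    totals = defaultdict(int)
    for row in orders:
        totals[row[col]] += row[2]
    return dict(totals)

print(aggregate_by("Order Date"))
print(aggregate_by("Ship Date"))
```

The same fact rows aggregate differently depending on which role is used, even though only one set of date members exists.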

Lesson 2
Defining Attribute Hierarchies
Hierarchies define the multi-level structure of a dimension, and the relationships between the levels
can affect query and processing performance.

Lesson Objectives
After completing this lesson, you will be able to:

Describe hierarchies.

Distinguish parent-child from balanced hierarchies and describe when each type would be used.

Describe ragged hierarchies and the options you have to configure them.

Describe considerations for using hierarchies.

Create attribute relationships.

What Are Hierarchies?


Attributes are exposed to users through attribute
hierarchies. An attribute hierarchy in a dimension
includes an optional All level and distinct
members of the attribute. For example, a
Customer dimension might include a Name
attribute hierarchy with two levels: the All level
and a second level with a member for each name.
Attribute hierarchies define the space of a cube,
which you can think of as the multidimensional
space created by the product of its attribute
hierarchies. As well as being containers for
attribute hierarchies, dimensions can also include
user hierarchies as a navigational convenience, but these do not affect cube space.

Defining relationships between hierarchy levels enables Analysis Services to establish more useful
aggregations to increase query performance. It can also save memory during processing, which is
often important with large or complex cubes.

When attributes are arranged into user-defined hierarchies, you outline relationships between hierarchy
levels. The levels are connected in a many-to-one or one-to-one relationship, referred to as a natural
relationship. For example, in a Calendar Time hierarchy, a Day level should be related to the Month
level, the Month level to the Quarter level, and so on.

A natural hierarchy is composed of attributes where each is a member property of the attribute below. For
example, a Customer Geography hierarchy with the levels Country-Region, State-Province, City,
Postal Code and Customer, is a natural hierarchy because of the relationships between the attributes. By
contrast, a Marital Status-Gender hierarchy containing the levels Marital Status and Gender, is non-
natural, because marital status and gender do not have a hierarchical relationship to each other.

For performance reasons, natural hierarchies are preferred but the dimension designer will warn you if
you create a hierarchy that is non-natural. You can ignore this warning when it makes sense to use a non-
natural hierarchy. For example, it is plausible that you would want to drill down from marital status to
gender. On its own, the gender is of no use but, when combined with marital status, it returns useful data.

Parent-Child Hierarchies
A parent-child hierarchy exists when you have a
self-referencing dimension table. For example, an
Employee dimension might have a Manager
attribute. With the exception of the most senior
employee, all other employees have a position in
the hierarchy defined by the identities of their
managers and of those they manage.

A parent-child hierarchy is an unbalanced hierarchy. This imbalance occurs when the number of levels
from leaf to root is different for individual leaf members. For example, some managers might have
three levels of employee working for them while others have just two.

Parent-child hierarchies are defined, not by the order in which attributes are added, but by the
relationships in the dimension table.
Parent-child dimensions are changing dimensions, which means that rows can be modified without the
need for processing. They are also the only dimension type you can write-enable.

The reason that parent-child dimensions can be write-enabled and are changing dimensions is that the
hierarchy structure is calculated at run time. This can be beneficial if the dimension records change
frequently, but there are performance considerations. Because the parent-child hierarchy is calculated at
run time, no aggregates are calculated when the cube is processed. For example, the total sales for each
sales manager and every employee who reports to them is only calculated when the query is run.

This trade-off between a versatile changing dimension and query performance must be considered
and, ideally, the effects tested. If you do not want to use a parent-child dimension, it can be recreated
as a traditional dimension. For example, an employee record could include a column for each level of
manager above them in the hierarchy. This would allow the dimension aggregates to be calculated
during processing, but with a loss of flexibility.

You should also consider what to do with non-leaf data. In the example we have used, the sales
manager can also make sales. If this is the case, and you have data in the fact table associated with
non-leaf members, you must change the dimension's MembersWithData property to Nonleaf data
visible or Nonleaf data hidden. Otherwise, processing the cube fails. If you use Nonleaf data visible,
the parent (in this case the sales manager) will also appear in the level below and seem to report to
themselves. If you use Nonleaf data hidden, the sales manager will not appear in the level below. If,
however, you perform a sum on all employee sales, the value will not include the sales manager and,
therefore, will not equal the department total.

Note: For more information about parent-child hierarchies, go to


http://go.microsoft.com/fwlink/?LinkID=246783.

Ragged Hierarchies
To the user, ragged and parent-child hierarchies often seem similar. Both can have a different number
of levels in separate parts of the hierarchy, but ragged hierarchies are formed from different dimension
table columns rather than from a self-referencing relationship. For example, in a Location dimension
you might have Location, Region, State, and Country levels, but the State level may only be used for
the United States, Canada, and Australia. All other countries may skip the State level.
The HideMemberIf property makes a regular hierarchy ragged. There are five possible values for this
property:

Never. This value creates a regular hierarchy.

OnlyChildWithNoName. This value hides a level member when it is an only child and is null or a
zero-length string.

OnlyChildWithParentName. This value hides a level member when it is an only child with the same
name as its parent.

NoName. This value hides a level member when it is null or a zero-length string.

ParentName. This value hides a level member when it has the same name as its parent.

You can use a ragged hierarchy as an alternative to a parent-child dimension. The aggregates are
calculated when the cube is processed, improving query performance. A ragged hierarchy is not, however,
a changing dimension so the flexibility of parent-child dimensions is lost.
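The member-hiding rules can be sketched as predicates over a member and its parent; the level values used below are invented:

```python
def hide_member(name, parent_name, is_only_child, rule):
    """Rough sketch of the HideMemberIf values described above."""
    blank = name is None or name == ""
    if rule == "Never":
        return False
    if rule == "OnlyChildWithNoName":
        return is_only_child and blank
    if rule == "OnlyChildWithParentName":
        return is_only_child and name == parent_name
    if rule == "NoName":
        return blank
    if rule == "ParentName":
        return name == parent_name
    raise ValueError(f"unknown rule: {rule}")

# A country with no State level repeats the country name at the State level,
# so ParentName hides that member and the hierarchy appears ragged.
print(hide_member("Denmark", "Denmark", True, "ParentName"))
```

The first call hides the placeholder State member under Denmark, while a genuine state such as Texas under the United States is never hidden.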

Using Hierarchies
You typically create user-defined hierarchies to
allow users to drill up and drill down through the
data, and to allow Analysis Services to create
meaningful aggregations of the measures.

There are several important hierarchy properties:

The IsAggregatable property defines whether an All level is created. In some scenarios, an All level is
irrelevant. For example, the Time dimension in a solution holds data only from January 12, 1995,
because earlier data is too costly to convert and gives minimal business benefit. An aggregate across
this arbitrary range of dates is not relevant to the business, so the hierarchy does not need an All level.

The AttributeHierarchyOrdered property defines whether the hierarchy is ordered. If this is set to
False and you do not query the attribute hierarchy, you will reduce processing time. If you only use
an attribute hierarchy to order another attribute hierarchy, it is not queried.

The AttributeHierarchyOptimizedState property can be set to NotOptimized to prevent Analysis
Services from creating indexes on the attribute hierarchy. This will reduce processing time but increase
query time, and should only be used when you do not query the attribute hierarchy.

Hiding and disabling attribute hierarchies can improve performance and focus data for users. For
example:

If you set the AttributeHierarchyEnabled property to False, the hierarchy is disabled and Analysis
Services only creates the attribute hierarchy as a member property. This is useful if the attribute is
providing detail, but you do not want to use it as a level for aggregation.

If you set the AttributeHierarchyVisible property to False, the attribute is only visible from user-
defined hierarchies. This is useful if there are large numbers of distinct values in the attribute
hierarchy that would cause confusion and not add any benefit.

Attribute Relationships
Attributes are always related directly or indirectly
to the key attribute. Initially all attributes are
directly related to the key attribute and this might
be the relationship you require. For example, a
Product dimension contains ProductKey, Name,
Weight, ProductionCost and
RecommendedSalesCost. In this dimension you
do not group by any attribute, and you only need
the All level and one other level containing every
member.

In most dimensions, however, there are levels with relationships between them. For example, you have
a Customer dimension containing attributes for CustomerKey, Name, Address, City, Region, and
Country. Name and Address relate directly to the CustomerKey, but you want to create relationships
for the other attributes. You form a relationship from Country to Region, from Region to City, and
from City to CustomerKey, which supports levels within the data and will be used to create aggregates
to improve query performance.

Attribute relationships are straightforward to create using the Attribute Relationships tab of the
dimension designer. You can right-click a blank space to create a new attribute relationship, and then
choose the source and related attributes. Each source attribute can only have one related attribute. For
example, a month can only have one quarter, although a quarter can have several months. In this
relationship, the month is the source attribute and the quarter is the related attribute.

Demonstration: Creating a Parent-Child Hierarchy


In this demonstration, you will see how to:

Create a Parent-Child Hierarchy.

Name the Levels in a Parent-Child Hierarchy.

Demonstration Steps
Create a Parent-Child Hierarchy

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and log on
to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then in the
D:\Demofiles\Mod03 folder, run Setup.cmd as Administrator.

2. Start Visual Studio and open Demo.sln in the D:\Demofiles\Mod03 folder.

3. In Solution Explorer, expand Data Source Views and double-click Adventure Works Demo DSV.dsv
to open it.

4. On the Data Source View menu, click Add/Remove Tables. Then in the Add/Remove Tables
dialog box, add the DimEmployee table to the Included objects list and click OK.

5. In the data source view diagram, right-click the DimEmployee table and click New Named
Calculation. Then in the Create Named Calculation dialog box, create a named calculation with the
column name Full Name and the expression [FirstName] + ' ' + [LastName] and click OK.

6. In Solution Explorer, right-click Dimensions and click New Dimension.

7. On the Welcome to the Dimension Wizard page, click Next.


8. On the Select Creation Method page, select Use an existing table and click Next.

9. On the Specify Source Information page:

o In the Main Table box select DimEmployee.

o In the Key column list, ensure that EmployeeKey is selected.

o In the Name Column box select Full Name.

o Click Next.

10. On the Select Related Tables page, clear the Sales Territory check box so that it is not selected, and
click Next.

11. On the Select Dimension Attributes page, click Next.


12. On the Completing the Wizard page, in the Name box, type Employee and click Finish.

Name the Levels in a Parent-Child Hierarchy

1. In the Employee.dim dimension designer, on the Dimension Structure tab, in the Attributes
section, click Parent Employee Key and press F4.

2. In the Properties pane, note that the Usage property for this attribute has been set to Parent.

3. In the NamingTemplate property value, click the ellipsis (...) button.

4. In the Level Naming Template dialog box, in the first blank Name cell, type Executive.

5. In the next blank Name cell, type Middle Management.

6. In the next blank Name cell, type Senior Employee.


7. In the next blank Name cell, type Junior Employee. Then click OK.

8. In the Properties pane, change the MembersWithData property value to NonLeafDataHidden.

9. In the Attributes pane, click the Employee dimension (at the root of the attributes tree), and set the
UnknownMember property to None. Then select the Employee Key attribute, expand its
KeyColumns property, expand the DimEmployee.EmployeeKey column, and set the
NullProcessing property to Automatic.

10. In Solution Explorer, right-click Demo and click Deploy. If prompted, specify the password Pa$$w0rd
for the ADVENTUREWORKS\ServiceAcct user.

11. When deployment has completed, in the Employee.dim dimension designer, in the Browser tab,
expand the employees in the hierarchy. Note that, when you select an employee, the hierarchy level
name is displayed at the top of the browser area.

Lesson 3
Sorting and Grouping Attributes
A cube can be difficult to navigate if it has numerous attributes and attribute hierarchies. In Analysis
Services, you can sort by the member name, the member key, or by a related attribute. You can also
group attributes. A member group is a system-generated collection of consecutive dimension members.
To improve usability in Analysis Services, attribute members can be formed into member groups through
a process called discretization.

Lesson Objectives
This lesson describes sorting and grouping attributes. After completing this lesson, you will be able to:

Sort attributes.

Group attributes.

Sorting Attributes
You can choose any attribute by which to sort the
hierarchy. This attribute can be the key, the name,
or any other secondary attribute in the dimension
table.

If you order by name, Analysis Services will order the members in alphanumeric order.

If you order by key, you can specify one or more key columns and use these to sort the members. This
allows you to sort dates by quarters. With a single key, the order of quarters would be Q1, Q2, Q3, and
Q4 with all years aggregated. For example, Q1 would aggregate all Q1 data from every year. If you add
the Year column to the key, the order would be Q1 2000, Q2 2000, Q3 2000, Q4 2000, Q1 2001,
Q2 2001, Q3 2001, and so on.

You can also order by a secondary attribute, which could be a standard or calculated column created
in the data source view. For example, you might want to sort a Course dimension in which the course
name runs from Course 1 to Course 450. The course key is provided by the course vendor and is
not a relevant sort order. If you sort by name, the order is Course 1, Course 11, Course 111,
Course 2, Course 22, and so on. Therefore, you create a new column in the dimension table with
the course number as a numeric field. This can then be used to sort data in the correct order.
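Both sorting scenarios above can be sketched with Python's key-based sorting. The quarter rows are invented; the course names come from the example:

```python
# Order by a composite key: sorting on (year, quarter) orders members
# chronologically instead of grouping all Q1s together.
quarters = [(2001, 1, "Q1 2001"), (2000, 3, "Q3 2000"),
            (2000, 1, "Q1 2000"), (2001, 2, "Q2 2001")]
by_year_then_quarter = [q[2] for q in sorted(quarters, key=lambda q: (q[0], q[1]))]

# Order by a secondary numeric attribute: an alphanumeric sort puts
# "Course 11" before "Course 2", so a derived numeric column restores
# the intended order, like a calculated sort column in the dimension table.
courses = ["Course 1", "Course 11", "Course 111", "Course 2", "Course 22"]

def course_number(name):
    return int(name.split()[-1])

by_name = sorted(courses)
by_number = sorted(courses, key=course_number)
print(by_year_then_quarter)
print(by_number)
```

The tuple key plays the role of a multi-column OrderBy key, and course_number plays the role of the numeric sort column created in the data source view.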

Grouping Attributes
Some hierarchies have no natural levels, so
Analysis Services only creates All and Leaf levels.
This can make the cube difficult to navigate if
there are many members in the hierarchy. You can
use grouping to organize members into groups to
simplify cube browsing. For example, customers
could be grouped into income brackets rather
than showing individual incomes. You can then
drill down into the groups to show the detail as
needed.

To enable grouping, you must set the DiscretizationMethod property of the attribute. This can be set
to:

Equal Areas. To divide the members into groups with an equal number of members.

Clusters. To use a clustering algorithm to group members based on the training data. This can form
useful groups, but has a higher processing cost.

To specify the number of groups, you must set the DiscretizationBucketCount property. The default is
based on the square root of the number of distinct members.
To specify a naming template, you must set the Format option for the NameColumn property of an
attribute. The default naming template displays the first and last group members in the format
January - March. You can create your own naming templates to modify group names. Because group
names are based on the members they contain, the names of groups can change as members are
added or removed.
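Equal-area discretization can be sketched roughly as splitting the ordered members into groups of near-equal size, with the default group count based on the square root of the distinct member count. The income values are invented, and this is only an approximation of the Analysis Services algorithm:

```python
import math

incomes = sorted([10, 20, 30, 40, 50, 60, 70, 80, 90])

# The default bucket count is based on the square root of the number
# of distinct members.
default_buckets = round(math.sqrt(len(set(incomes))))

def equal_areas(members, bucket_count):
    """Split ordered members into bucket_count groups of near-equal size."""
    size = math.ceil(len(members) / bucket_count)
    return [members[i : i + size] for i in range(0, len(members), size)]

groups = equal_areas(incomes, default_buckets)
print(default_buckets, groups)
```

Each group's first and last members would then feed the naming template, producing labels such as "10 - 30" for the first band.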

Lab: Defining Dimensions


Scenario
You have created a cube and tested it with a small set of business users whose feedback says that the
cube shows potential but is difficult to navigate intuitively. To resolve this problem, you plan to refine the
cubes dimensions to include the attributes by which users want to aggregate data, and hierarchies that
make it easy to do so at multiple levels.

Objectives
After completing this lab, you will be able to:

Configure dimensions and attributes.

Create hierarchies.
Create a hierarchy with attribute relationships.

Create a ragged hierarchy.

Browse dimensions and hierarchies.

Estimated Time: 60 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student


Password: Pa$$w0rd

Exercise 1: Configuring Dimensions and Attributes


Scenario
Several users in the Adventure Works Cycle company want to be able to view data across time periods, so
you need to configure the Date dimension as a Time dimension so that Analysis Services can apply
temporal calculations to values.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Remove Unused Attributes

3. Add Dimension Intelligence

4. Group Attribute Members

Task 1: Prepare the Lab Environment


1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab03\Starter folder as Administrator.

Task 2: Remove Unused Attributes


1. Open the Adventure Works OLAP.sln solution in the D:\Labfiles\Lab03\Starter folder in Visual Studio.

2. Open the Customer dimension in the dimension designer and note that many attributes have been
added to allow business users to aggregate measures in many different ways. However, users have
complained that some of these attributes are unnecessary and that they should be removed to make
browsing the cube simpler.

3. Delete the Commute Distance, Number Cars Owned, and Number Children At Home attributes
from the Customer dimension.

4. Similar feedback has been received about the Product dimension, so delete the Days To
Manufacture and Safety Stock Level attributes from the Product dimension.

Task 3: Add Dimension Intelligence


1. Use the Business Intelligence Wizard on the Date dimension to define dimension intelligence.

2. Specify that the dimension is a Time dimension.

3. Map the dimension attribute columns as follows:

o Year: Calendar Year

o Half Year: Calendar Semester

o Quarter: Calendar Quarter

o Month: Month

o Date: Full Date Alternate Key

Task 4: Group Attribute Members


1. Open the Customer dimension in dimension designer, and explore the data in the Customer table.
Notice the range of values for the YearlyIncome column.

2. Process the dimension and browse the Yearly Income hierarchy.

o If you are prompted to redeploy the project, do so.

o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

3. Browse the dimension and view the Yearly Income hierarchy. Note that this hierarchy has no
structure.
4. Modify the following properties of the Yearly Income attribute:

o DiscretizationMethod: Automatic

o DiscretizationBucketCount: 5

o OrderBy: Key

5. Reprocess the dimension and browse the Yearly Income hierarchy to verify that the values are now
grouped into five income bands, with a sixth band for customers whose income is unknown.

Results: After this exercise, the Customer and Product dimensions have had some attributes removed,
time intelligence has been added to the Date dimension, and the Yearly Income attribute members
have been grouped into income bands.

Exercise 2: Creating Hierarchies


Scenario
Business users need to view aggregations at different levels. Specifically, analysts want to view business
results for product categories, and then drill down to see details for subcategories and individual
products. Analysts also want to view customers by gender and then, for each gender, view business results
based on marital status.

The main tasks for this exercise are as follows:

1. Create a Natural Hierarchy

2. Create a Non-Natural Hierarchy

Task 1: Create a Natural Hierarchy


1. Edit the Product.dim dimension to create a hierarchy that contains the following attributes:

o English Product Category Name

o English Product Subcategory Name

o English Product Name


Note: If a warning is displayed, notifying you that attribute relationships do not exist and performance
may be decreased, ignore it. You will see how to use attribute relationships to optimize a hierarchy later in
this lab.
2. Rename the hierarchy to Categorized Products.

3. Rename the hierarchy levels as follows:

o Category

o Subcategory

o Product

4. Set the AttributeHierarchyVisible property of all attributes to False so that the Categorized
Products hierarchy is the only way to browse the Product dimension.

5. Process the dimension when you have finished.

o If you are prompted to redeploy the project, do so.

o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

6. Browse the Categorized Products hierarchy in the dimension browser.


7. Save the dimension when you have finished.

Task 2: Create a Non-Natural Hierarchy


1. In the Customer dimension, create a hierarchy named Gender-Marital Status that includes the
following attributes:

o Gender

o Marital Status

Note: If a warning is displayed, saying that attribute relationships do not exist and performance may be
decreased, ignore it. You will see how to use attribute relationships to optimize a hierarchy later in this
lab.

2. Process the dimension when you have finished.



o If you are prompted to redeploy the project, do so.

o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

3. Browse the Gender-Marital Status hierarchy in the dimension browser. Note that gender can be F,
M, or Unknown, and that marital status can be M or S.

4. Save the dimension when you have finished.

Results: After this exercise, you should have created a Categorized Products hierarchy and a Gender-
Marital Status hierarchy.

Exercise 3: Create a Hierarchy with Attribute Relationships


Scenario
A large proportion of the analysis performed at Adventure Works includes assessing business performance
over specific time periods. For this reason, analysts need to aggregate business measures in a temporal
hierarchy that includes year, semester, quarter, month, and day.
The main tasks for this exercise are as follows:

1. Configure Attribute Column Bindings

2. Create Attribute Relationships


3. Create a Hierarchy

Task 1: Configure Attribute Column Bindings


1. In the dimension designer for the Date dimension, modify the Calendar Semester attribute so that a
semester is uniquely identified by both the calendar year and semester, but displayed using only the
calendar semester value. To do this:

o In the properties for the Calendar Semester attribute, set the KeyColumns property so that the
key columns are CalendarYear followed by CalendarSemester.
o Set the NameColumn property so that the source column value is CalendarSemester.

o Set the ValueColumn property so that the source column value is CalendarSemester.

2. Modify the Calendar Quarter attribute so that a quarter is uniquely identified by both the calendar
year and the calendar quarter, but displayed using only the calendar quarter value. To do this:

o In the properties of the Calendar Quarter attribute, set the KeyColumns property so that the
key columns are CalendarYear followed by CalendarQuarter.

o Set the NameColumn property so that the source column value is CalendarQuarter.

o Set the ValueColumn property so that the source column value is CalendarQuarter.

3. Modify the Month attribute so that a month is uniquely identified by both the calendar year and the
month number of year, but displayed using the month name value. To do this:

o Set the KeyColumns property so that the key columns are Calendar Year followed by
MonthNumberOfYear.
o Set the NameColumn property so that the source column value is EnglishMonthName.

o Set the ValueColumn property so that the source column value is EnglishMonthName.
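The reason steps 1–3 use composite keys can be illustrated with a small sketch (plain Python over hypothetical rows, not the SSAS engine): a month name alone does not uniquely identify a member across years, whereas the (CalendarYear, MonthNumberOfYear) key does, while NameColumn still displays only the month name.

```python
# Rows of (CalendarYear, MonthNumberOfYear, EnglishMonthName) from a
# hypothetical date table covering two years.
rows = [(y, m, name)
        for y in (2007, 2008)
        for m, name in enumerate(
            ["January", "February", "March", "April", "May", "June",
             "July", "August", "September", "October", "November",
             "December"], start=1)]

# Keying by name alone collapses the two years into 12 members...
by_name = {name for _, _, name in rows}

# ...while the composite key keeps January 2007 and January 2008 distinct,
# even though NameColumn still displays only the month name.
by_composite_key = {(y, m) for y, m, _ in rows}

print(len(by_name), len(by_composite_key))  # 12 24
```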

Task 2: Create Attribute Relationships


1. In the Date dimension, create the following attribute relationships:

o Full Date Alternate Key > Month (Rigid)

o Month > Calendar Quarter (Rigid)

o Calendar Quarter > Calendar Semester (Rigid)

o Calendar Semester > Calendar Year (Rigid)

2. Save the dimension when you have finished.

Task 3: Create a Hierarchy


1. Modify the Date dimension structure to create a new hierarchy named Calendar Date that contains
the following levels:

o Calendar Year

o Calendar Semester

o Calendar Quarter

o Month

o Full Date Alternate Key (renamed to Day)

2. Set the AttributeHierarchyVisible property of the attributes that are included in the Calendar Date
hierarchy to False so that they can only be used by browsing the hierarchy.

3. Process the dimension when you have finished.

o If you are prompted to redeploy the project, do so.


o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

4. Browse the Calendar Date hierarchy in the dimension browser. Note that months are displayed in
alphabetical rather than chronological order.

5. Modify the Month attribute to set its OrderBy property to Key. The key for this attribute consists of
the Calendar Year and Month Number of Year columns, so sorting by key should ensure that
months are displayed in chronological order.

6. Reprocess the dimension and verify that the months are now displayed in the correct order.
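The difference between the two sort orders in steps 4–5 can be sketched as follows (a plain Python illustration with hypothetical month rows, not the SSAS sort implementation):

```python
# (CalendarYear, MonthNumberOfYear, EnglishMonthName) rows.
months = [(2008, 1, "January"), (2008, 2, "February"), (2008, 3, "March"),
          (2008, 4, "April"), (2008, 8, "August"), (2008, 12, "December")]

# OrderBy = Name sorts members alphabetically, which is what the browser
# showed before step 5.
by_name = [name for _, _, name in sorted(months, key=lambda r: r[2])]

# OrderBy = Key sorts by the composite key (CalendarYear,
# MonthNumberOfYear), restoring chronological order.
by_key = [name for _, _, name in sorted(months, key=lambda r: (r[0], r[1]))]

print(by_name[0], by_key[0])  # April January
```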

Results: At the end of this exercise, you will have a hierarchy named Calendar Date.

Exercise 4: Creating a Ragged Hierarchy


Scenario
Sales data at Adventure Works is analyzed by sales territory, which is organized into groups, countries, and regions. Some countries do not have a distinct region level, so analysts want a Sales Territory hierarchy that hides the redundant region member for those countries.
The main tasks for this exercise are as follows:

1. Configure Attribute Column Bindings

2. Create Attribute Relationships

3. Create a Hierarchy

Task 1: Configure Attribute Column Bindings


1. In the Sales Territory dimension, modify the Sales Territory Country attribute so it is uniquely
identified by both the sales territory group and country, but displayed using only the sales territory
country value. To do this:

o In the properties for the Sales Territory Country attribute, set the KeyColumns property so that
the key columns are SalesTerritoryGroup followed by SalesTerritoryCountry.
o Set the NameColumn property so that the source column value is SalesTerritoryCountry.

o Set the ValueColumn property so that the source column value is SalesTerritoryCountry.

2. Modify the Sales Territory Region attribute so it is uniquely identified by the sales territory group,
country, and region but displayed using only the sales territory region value. To do this:

o In the properties of the Sales Territory Region attribute, set the KeyColumns property so that
the key columns are SalesTerritoryGroup followed by SalesTerritoryCountry, followed by
SalesTerritoryRegion.

o Set the NameColumn property so that the source column value is SalesTerritoryRegion.

o Set the ValueColumn property so that the source column value is SalesTerritoryRegion.

3. Save the dimension.

Task 2: Create Attribute Relationships


1. In the Sales Territory dimension, create the following attribute relationships:

o Sales Territory Region > Sales Territory Country (Rigid)


o Sales Territory Country > Sales Territory Group (Flexible)

2. Save the dimension when you have finished.

Task 3: Create a Hierarchy


1. Modify the Sales Territory dimension structure to create a new hierarchy named Sales Territory that
contains the following levels:

o Sales Territory Group

o Sales Territory Country

o Sales Territory Region

2. Set the AttributeHierarchyVisible property of the dimension attributes to False so that this
dimension can only be used by browsing the hierarchy.

3. Process the dimension when you have finished.

o If you are prompted to redeploy the project, do so.

o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

4. Browse the Sales Territory hierarchy in the dimension browser and note the following:

o Territories with no region level display the country name instead.

o There is an unknown member for members where the sales territory is unknown, but the NA
value in the database is already used for this purpose.

5. Modify the Sales Territory dimension to set its UnknownMember property to None. Then set the
NullProcessing property of the Sales Territory Key KeyColumns property to Automatic.

6. Set the HideMemberIf property of the Sales Territory Region hierarchy level to
OnlyChildWithParentName.

7. Reprocess the dimension and verify that the unknown member has been removed, and that sales
territories without regions can't be expanded beyond the country level.

8. Save the dimension when you have finished.
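The rule applied by OnlyChildWithParentName can be sketched in a few lines (a plain Python illustration of the rule, not the SSAS implementation; the territory names are examples): a region-level member is hidden when it is its parent's only child and repeats the parent's name.

```python
def visible_children(parent_name, children):
    """Apply the OnlyChildWithParentName rule: hide a child when it is the
    only child and shares its parent's name."""
    if len(children) == 1 and children[0] == parent_name:
        return []
    return children

# United States has several regions, so all of them stay visible...
assert visible_children("United States", ["Northwest", "Southwest"]) == \
    ["Northwest", "Southwest"]

# ...but a country with no real region level repeats the country name at the
# region level, and that single child is hidden, making the hierarchy ragged.
assert visible_children("Canada", ["Canada"]) == []
```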

Results: After this exercise, you should have created a ragged Sales Territory hierarchy.

Exercise 5: Browse Dimensions and Hierarchies in a Cube


Scenario
Now you have implemented the dimensions and hierarchies required by the business analysts, you must
test them.

The main tasks for this exercise are as follows:

1. Process the Cube

2. Browse the Cube in Visual Studio

3. Browse the Cube in Excel

Task 1: Process the Cube


1. Process the Sales cube.

o If you are prompted to redeploy the project, do so.


o If you are prompted for impersonation credentials, enter the password Pa$$w0rd for the
ADVENTUREWORKS\ServiceAcct account.

Task 2: Browse the Cube in Visual Studio


1. Use the cube browser to browse the Sales cube, and view the Internet Revenue measure.

2. Add the Gender-Marital Status hierarchy from the Customer dimension to the query, and note that
Internet revenue is broken down by every combination of gender and marital status.

3. Add the Yearly Income hierarchy to the query filter, and filter the results to include only revenue
from customers in the highest income band.

Task 3: Browse the Cube in Excel


1. Analyze the cube in Excel, and view the Internet Revenue measure.

2. Add the Calendar Date hierarchy in the Order Date dimension to the PivotTable, and note that you
can drill down to see revenue at the year, semester, quarter, month, and individual date levels.

3. Add the Sales Territory hierarchy in the Sales Territory dimension to the PivotTable, and note that
you can drill down to view sales territory countries for all sales territory groups, but only the United
States has a third sales territory level.

4. When you have finished, close Excel without saving the workbook, and close Visual Studio.

Results: At the end of this exercise, you will have tested the dimensions and hierarchies you created in the
lab.

Module Review and Takeaways


In this module, you have learned how to configure dimensions and attributes in a cube.

Review Question(s)
Question: Aside from organizational charts, where else might you find parent-child
hierarchies?

Question: When might you want to use a secondary attribute for sorting data?

Module 4
Working with Measures and Measure Groups
Contents:
Module Overview 4-1

Lesson 1: Working with Measures 4-2

Lesson 2: Working with Measure Groups 4-6

Lab: Configuring Measures and Measure Groups 4-13

Module Review and Takeaways 4-17

Module Overview
Measures and measure groups specify the data values your cube can aggregate to provide summary
values for analysis. This module describes measures and measure groups. It also explains how you can use
them to define fact tables and associate dimensions with measures.

Objectives
After completing this module, you will be able to:

Configure measures.
Configure measure groups.

Lesson 1
Working with Measures
This lesson explains how to work with measures, including configuring how measures are displayed and
aggregated, and information about measure values.

Lesson Objectives
After completing this lesson, you will be able to:

Describe the concept of a measure.

List the properties of measures.

Configure measure formatting.

Choose appropriate aggregation functions for required additivity.

Introducing Measures
A measure represents a column that contains
quantifiable data, usually numeric, that you can
aggregate. Typically, measures occur as a column
in the fact table and are the values the business
user finds most interesting. Measures are
organized into measure groups.
For example, in a retail cube with dimensions for
stores, products, time, and salespeople, the
measures might include sales value and units.
These measures are then aggregated to provide
total sales units and value for each store, product,
day, and salesperson.
To provide specific aggregated details, you could pick a store from the stores dimension, a product from
the products dimension, a day from the time dimension, a salesperson from the salesperson dimension,
and either sales units or sales value from the measures dimension. Although measures are part of the
measures dimension, they are distinct from all other dimensions and are typically referred to as measures.

You create measures as part of the Cube Wizard or on the Cube Structure tab of the Cube Designer.

Calculated Measures
You can create calculated measures based on other measures in the fact table. You use a measure
expression to define the value of a calculated measure, based on a fact table column as modified by a
Multidimensional Expression (MDX) statement. You should consider the performance impact of using
calculated measures because they are worked out at run time. If a calculated measure is infrequently used
and is quite simplefor example, the sum of two measuresit will have little performance impact on
queries, but might reduce processing times. If a measure is frequently used and involves a complex
calculation, it might cause unacceptable query times.

Measures from Attribute Columns


You can use attribute columns from dimension tables to define measures, but these are typically
semiadditive or nonadditive in terms of their aggregation behavior. For example, an account balance can
be added for all customers, but should not be added for an individual customer over time. If a customer
has a positive balance of $10.00 and maintains this for a year, it should not be totaled by week as that
would create a balance of $520.00 rather than $10.00.
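The account-balance example can be made concrete. In the sketch below (plain Python, not SSAS), fully additive aggregation over time inflates the balance exactly as described, whereas a LastNonEmpty-style aggregation reports the closing balance:

```python
# Weekly balances for one customer who holds $10.00 all year.
weeks = [10.00] * 52

# A fully additive aggregation over the time dimension overstates the
# balance -- this is the $520.00 error described above.
summed_over_time = sum(weeks)

# A semiadditive, LastNonEmpty-style aggregation takes the most recent
# non-empty value instead.
last_non_empty = next(v for v in reversed(weeks) if v is not None)

print(summed_over_time, last_non_empty)  # 520.0 10.0
```

Summing the same balances across customers, by contrast, is meaningful, which is why such measures are semiadditive rather than nonadditive.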

Aggregation
The aggregation behavior of each measure is determined by the aggregation function associated with the
measure.

Note: For more information about aggregation, see the Aggregation Functions topic later
in this lesson.

Granularity
You should consider the granularity of measures. You cannot make a measure more granular than the
highest level of granularity in the dimension tables, but you can make it less so. For example, if you have a
time dimension that stores data at levels of day, month, quarter, and year, you cannot store sales
measures at a granularity of hours, but they can be stored at a granularity of months. You can specify the
granularity of a measure group in relation to a specific dimension by using the Dimension Usage tab of
the Cube Designer

Note: Granularity refers to how specific something is. For example, an hour is more
granular than a day, and a day is more granular than a week.

Measure Properties
Measures have properties that enable you to
define how the measures function and control
how the measures appear to users. You can
configure measures on the Cube Structure tab of
the Cube Designer.
Measures inherit certain properties from the
measure group of which they are a member,
unless those properties are overridden at the
measure level.

Measure properties determine:

How measures are aggregated.


The data type of the column in the underlying fact table to which the measure is bound.

The column in the data source view to which the measure is bound.

The description of the measure, which may be exposed in client applications.

The name that is displayed to the user.

The folder in which the measure will appear when users connect to the cube.

The display format of the measure.



The unique identifier (ID) of the measure. Note that this property is read-only.

Any MDX expressions that define the measure.

The visibility of the measure.

Configuring How Measures Are Displayed


You can configure the measure property
FormatString, which determines how measure
values are displayed to users, by using the
Properties window of a measure in the Cube
Designer.

Typical FormatString values include General Number or Medium Date but, although a list of display
formats is provided, you can specify many additional formats that are not in the list by specifying any
named or user-defined format that is valid in Microsoft Visual Basic. This might include changing the
order of a date or introducing currency symbols.

Note: The currency format will display the currency symbol of the client's location.
Therefore, the value may be incorrect and, in most scenarios, it is more appropriate to employ a
user-defined FormatString.

The table on the slide assumes that the regional setting in Control Panel on the client computer is English
(United States).
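As a rough illustration of the rendered results, the snippet below approximates a few common formats with Python's formatting (SSAS itself accepts Visual Basic format strings, so these format specs only illustrate the displayed output; the value is hypothetical):

```python
value = 1234567.891

# Approximations of common FormatString results for an English
# (United States) locale.
general_number = f"{value}"        # like "General Number": no grouping
standard = f"{value:,.2f}"         # like "Standard": thousands separators
currency = f"${value:,.2f}"        # like "Currency" with an explicit symbol
percent = f"{0.5:.2%}"             # like "Percent"

print(standard, currency, percent)  # 1,234,567.89 $1,234,567.89 50.00%
```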

Aggregation Functions
The measure property AggregateFunction
determines how measures are aggregated, and
you can configure it by using the Properties
window of a measure in the Cube Designer. Some
measures do not use the Sum function and
aggregate in another way, for example, a profit
margin measure. For a supplier with around 100
products, this is typically about 50 percent. If you
display the aggregated profit margin for each
supplier, you would expect the value to be about
50, not 5,000, percent, so use the Avg rather than
Sum function.

Aggregation functions have different levels of additivity that fall into one of three categories:

An additive measure. This is also called a fully additive measure that you can aggregate along all the
dimensions included in the measure group that contains the measure, without restriction. Additive
measures are the most commonly used in Microsoft SQL Server Analysis Services (SSAS) and consist
of the Sum and Count functions.

A semiadditive measure. You can aggregate this along some, but not all, dimensions that are
included in the measure group that contains the measure. An example of a semiadditive aggregation
is a units-in-stock value for warehouse goods. This would be additive by region to reveal the total
units in stock, but would not be additive by time because, if you add last week's stock to this week's
stock, you could end up with a value that lists twice as much stock as you really have.

A nonadditive measure. You cannot aggregate this along any dimension in the measure group that
contains the measure. Instead, you must individually calculate the measure for each cube cell that
represents the measure. For example, a calculated measure that returns a percentage, such as profit
margin, cannot be aggregated from the percentage values of child members in any dimension.

Use the AggregateFunction property to define the function that is used to aggregate the measure.
Commonly used aggregation functions include:

The Sum function. This is additive and calculates the sum of all values for every child member. Sum is
the default function.

The Count function. This is additive and calculates the quantity of child members.

The Min function. This is semiadditive and calculates the lowest value for each child member.
The Max function. This is semiadditive and calculates the highest value for each child member.

The DistinctCount function. This is nonadditive and calculates the count of all unique child members.

The None function. This is nonadditive, performs no aggregation, and supplies values directly from
the fact table.

Other semiadditive aggregation functions include:

The ByAccount function. This calculates aggregation according to the aggregation function that is
assigned to the account type for a member in an account dimension.

The AverageOfChildren function. This calculates the average of values for all non-empty child
members.

The FirstChild function. This retrieves the value of the first child member.

The LastChild function. This retrieves the value of the last child member.

The FirstNonEmpty function. This retrieves the value of the first non-empty child member.

The LastNonEmpty function. This retrieves the value of the last non-empty child member.
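The common functions, and why DistinctCount is nonadditive, can be sketched over a set of child-member values (plain Python, with illustrative values, not the SSAS engine):

```python
values = [100, 250, 250, 400]   # measure values for four child members

aggregations = {
    "Sum": sum(values),                 # additive, the default
    "Count": len(values),               # additive
    "Min": min(values),                 # semiadditive
    "Max": max(values),                 # semiadditive
    "DistinctCount": len(set(values)),  # nonadditive
}

# DistinctCount is nonadditive: combining two members' distinct counts can
# double-count a value that both members share (250 appears under each), so
# it cannot be rolled up from child results and must be computed per cell.
left, right = [100, 250], [250, 400]
assert len(set(left)) + len(set(right)) != len(set(left + right))

print(aggregations)
```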

Note: For more information about how to Configure Measure Properties, go to
http://go.microsoft.com/fwlink/?LinkID=246784.

Lesson 2
Working with Measure Groups
You use measure groups to associate dimensions with measures. Properties you can configure at the
measure group level include:

Partitions. These are containers for a portion of the measure group data.

Aggregations. These are pre-calculated summaries of data from leaf cells that improve query
response times.

Lesson Objectives
This lesson describes measure group properties and the relationships between measure groups and
dimensions. It also explains partitions, aggregations, and how to configure measure group storage.

After completing this lesson, you will be able to:

Describe measure groups.


List the properties of measure groups.

Describe the relationships between measure groups and dimensions.

Explain partitions and measure group storage.

Describe aggregations and their effect on performance.

Explain how to configure measure group storage.

Introducing Measure Groups


In a cube, measures are sorted into measure
groups by their underlying fact tables. You use
measure groups to associate dimensions with the
measures in the fact table.

A measure group defines:

The dimensions the measures can be aggregated by. This is achieved by creating relationships
between measure groups and dimensions in the cube. For example, you might relate an Internet Sales
measure group to Product, Date, and Customer dimensions. In the same cube you might relate a
Reseller Sales measure group to Product, Date, and Reseller dimensions. Measures in both groups can be
aggregated by product and date, Internet sales measures by customer, and reseller sales measures by
reseller.

The granularity at which the measures are related to the dimensions. For example, a cube might
include a Reseller Sales measure group that is related to the Date dimension based on a key for an
individual day (because each sale is associated with the specific date on which it occurred), and a
Sales Quota measure group related to the Date dimension based on the year (because sales quotas
are set for each year, not for individual days, weeks, or months).

How measure aggregations should be stored. For example, you might optimize cube performance by
storing pre-aggregated values for some or all of the possible aggregations for measures in a measure
group and its related dimensions.

You define one or more measure groups in Visual Studio by using the Cube Wizard, and then add and
configure measure groups with the Cube Designer.

The DistinctCount aggregation looks for the number of distinct values in a fact table column. For
example, you might want to know how many sales staff sold a particular product. This query is very
resource-intensive and, therefore, Analysis Services creates a separate group for DistinctCount measures.

Measure Group Properties


Measure group properties determine behaviors for
the entire measure group and set default
behaviors for certain properties of measures within
a measure group. Measure group properties
include:

AggregationPrefix. This specifies the common prefix used for all aggregation names of the measure
group.

DataAggregation. This determines whether SSAS can aggregate persisted data or cached data for the
measure group. The default value is DataAndCacheAggregatable, which will aggregate persisted and
cached data. You can change this to CacheAggregatable or DataAggregatable. DataAggregatable
can be useful on systems with limited RAM because the other two settings will attempt to keep the
aggregations in memory to improve performance.

ErrorConfiguration. This provides configurable error-handling settings for handling of duplicate
keys, unknown keys, null keys, error limits, action upon error detection, and the error log file.

EstimatedRows. This specifies the estimated number of rows in the fact table. You use this when you
design aggregations. The value is static and should be kept approximately accurate, either manually
or by updating the count in the Aggregation Design Wizard.

EstimatedSize. This specifies the estimated byte size of the measure group. You use this when you
design aggregations.
ID. This specifies the identifier of the object.

IgnoreUnrelatedDimensions. This determines whether unrelated dimensions are forced to their top
level when members of dimensions unrelated to the measure group are included in a query.
ProactiveCaching. This defines how data is reprocessed when there are changes in the underlying
data source.

ProcessingMode. This indicates whether indexing and aggregating should occur during or after
processing. You can use lazy processing so that SSAS chooses a time when the workload is low, but
you will not know when the indexing workload will occur, or when indexing has completed and
queries are at peak performance.
ProcessingPriority. This determines the processing priority of the cube during background
operations, such as lazy aggregations and indexing.

StorageLocation. This is the file system storage location for the measure group. If none is specified,
the location is inherited from the cube that contains the measure group.

StorageMode. This determines the storage mode for the measure group.

Type. This specifies the type of the measure group.

Relationships Between Measure Groups and Dimensions


A relationship between a dimension and a
measure group consists of the dimension and fact
tables participating in the relationship and a
granularity attribute that specifies the granularity
of the dimension in the particular measure group.
This relationship is configured in the Dimension
Usage tab of the Cube Designer.

Note: When a measure group is updated, dimension relationship information is unavailable until the
cube is processed.

Regular Dimension Relationship


A regular dimension relationship represents the relationship between dimension tables and a fact table in
a traditional star schema design. The key column for the dimension is joined directly to the fact table.

Reference Dimension Relationship


A reference dimension relationship represents the relationship between dimension tables and a fact table
in a snowflake schema design. The key column for the dimension is joined indirectly to the fact table.

Fact Dimension Relationship


If a dimension consists entirely of values in a fact table and does not have a separate dimension table, it
has a fact dimension relationship.

Many-to-Many Dimension Relationship


Typically, every dimension member relates to many facts, but all of those relate to just one member in
each dimension. This is a standard one-to-many relationship. You may also have many-to-many
relationships when, for example, a customer has multiple reasons for making a purchase, each reason
having numerous customers. In a relational database, this relationship would be achieved through an
intermediate table and, in SSAS, a similar technique is used by joining the dimension to an intermediate
fact table, by joining the intermediate fact table to an intermediate dimension, and by joining the
intermediate dimension to the fact table.

Note: Foreign-key relationships between all tables involved in a many-to-many relationship must exist
in the underlying data source view.

You can use the Granularity attribute list to define the granularity of a particular relationship. Most
cubes will have time data, so it is common to have a shared time dimension table. This will store time
data at the highest level of granularity of any fact table but, although some fact tables may require data
that is accurate to the minute, many will have day, month, or quarter granularity. By specifying the
granularity of a measure group, you can use one time dimension for all these fact tables.
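Setting a coarser granularity amounts to rolling day-level keys up before relating the measure group to the dimension. The sketch below (plain Python with hypothetical fact rows, not SSAS) shows day-keyed sales being aggregated to the month granularity at which a quota measure group would be related:

```python
from collections import defaultdict

# Day-level sales facts keyed by (year, month, day)...
daily_sales = {(2008, 1, 5): 100.0, (2008, 1, 20): 150.0, (2008, 2, 3): 80.0}

# ...while quota facts are only meaningful at month granularity. Setting the
# measure group's granularity attribute to the month level is equivalent to
# rolling the day keys up to their month before relating the two.
monthly_sales = defaultdict(float)
for (year, month, _day), amount in daily_sales.items():
    monthly_sales[(year, month)] += amount

print(dict(monthly_sales))  # {(2008, 1): 250.0, (2008, 2): 80.0}
```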

Demonstration: Defining Relationships Between Dimensions and Measure Groups
In this demonstration, you will see how to define a referenced relationship.

Demonstration Steps
Define a Referenced Relationship

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and log on
to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then in the
D:\Demofiles\Mod04 folder, run Setup.cmd as Administrator.

2. Start Visual Studio, and open Demo.sln in the D:\Demofiles\Mod04 folder.

3. On the Build menu, click Deploy Solution. If prompted, specify the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd.

4. In Solution Explorer, double-click SalesDemo.cube, and on the Browser tab, expand Measures and
Reseller Sales and drag the Revenue measure to the query results area. Then expand the
Geography dimension and drag Country-Region to the left of the Revenue value. Note that the
values for each region are the same. The aggregation is incorrect.

5. In Solution Explorer, right-click Geography.dim, and then click View Designer. Note that this
dimension is based on the Geography table and has a Country-Region attribute.

6. In Solution Explorer, right-click Reseller.dim, and then click View Designer. Note that this dimension
is based on the DimReseller table, which includes a GeographyKey attribute that relates it to the
Geography table.

7. Click the tab for the SalesDemo cube, and on the Cube Structure tab, in the Data Source View
pane, note that there is no direct relationship between the Reseller Sales fact table and the
Geography dimension table.

8. On the Dimension Usage tab, click the intersection of the Reseller Sales measure group and the
Geography dimension, and then click the ellipsis (...) button.

9. In the Select relationship type list, select Referenced.

10. In the Intermediate dimension list, select Reseller.

11. In the Reference dimension attribute list, select Geography Key, and in the Intermediate
dimension attribute list, select Geography Key. Then click OK.

12. On the Dimension Usage tab, in the Dimensions list, right-click Geography, and then click
Rename.

13. Change the name of this cube dimension to Reseller Geography, and then press Enter to make this
name change take effect.

14. On the Build menu, click Deploy Solution. If prompted, specify the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd.

15. On the Browser tab for the SalesDemo cube, click the Reconnect button. Then right-click the query
results area and click Clear Grid.

16. Expand Measures and Reseller Sales and drag the Revenue measure to the query results area. Then
expand the Reseller Geography dimension and drag Country-Region to the left of the Revenue
value. Note that the values for each region are now correct.

Aggregations
Aggregations are pre-calculated summaries of leaf
cell data that improve query response time by
preparing the answers before the questions are
asked.

A simple Aggregation object consists of:

Basic information that includes the name of the aggregation, the ID, annotations, and a description.

A collection of AggregationDimension objects that contain the list of granularity attributes for the
dimension.

Aggregations give multidimensional cubes their performance benefits. They pre-calculate the answers to
queries at processing time so that, when a query runs, performance is hugely improved. If you consider
that a query might have to aggregate thousands of individual values to produce one summarized output
value, and that there may be hundreds of such values, it becomes clear that pre-calculating these results
has enormous benefits.

It might seem appropriate to pre-calculate every possible result, but some aggregations provide little
benefit. The Aggregation Design Wizard provides options for you to specify storage and percentage
constraints on the algorithm to achieve a satisfactory trade-off between query response time and storage
requirements. For example, if SSAS stores aggregates at the month level, it takes little processing to
generate quarter-level data at query run time. Typically, there is little benefit from pre-processing more
than 35 percent of aggregations, and you can specify this in the Aggregation Design Wizard.

SSAS incorporates a sophisticated algorithm to select aggregations for pre-calculation so that other
aggregations can be quickly computed from the pre-calculated values. To further improve aggregation
design for your specific system, you can use SSAS with normal workloads for some time, and then run the
Usage-Based Optimization Wizard. This is almost identical to the Aggregation Design Wizard but bases
optimization on queries submitted to SSAS and should, therefore, give better query response with less
processing time.

Configuring Measure Group Storage


There are three modes of multidimensional
storage in SSAS: Multidimensional Online
Analytical Processing (MOLAP), Relational Online
Analytical Processing (ROLAP), and Hybrid Online
Analytical Processing (HOLAP).

MOLAP
MOLAP storage stores both the aggregations and
a copy of the source data in the multidimensional
cube. MOLAP gives the greatest performance but
requires more storage space due to the
duplication. There is latency when you use MOLAP
storage because the cube data is refreshed only
when the cube is processed, so changes from the data source are only updated periodically.

Typically, the cube is available during processing, but query performance is affected and, if structural
changes occur to cube objects, the cube may need to be taken offline. To avoid these issues, you can
schedule processing for when the cube is rarely used, for example, at night. Often, this is not possible
because many systems are in use 24 hours a day, seven days a week, so you must use other strategies. If
you process the cube on a staging server, you can use database synchronization to copy the processed
data to the production server. You can also use proactive caching with MOLAP storage, which
incorporates changes made to the data source by using notifications. While the cache is rebuilt with new
data, you can choose to send queries to ROLAP data, which is up to date but slower, or to the original
MOLAP storage, which is faster but will not include the new data.

ROLAP
ROLAP storage stores the aggregations in indexed relational data, along with the source data. It does not
use multidimensional storage, so ROLAP data is slower to query and process than MOLAP or HOLAP, but
it enables real-time access to data and uses less storage space.

HOLAP
HOLAP storage stores aggregations in the multidimensional cube and leaves source data in the relational
database. This can provide a good compromise when leaf-level data is rarely accessed. If leaf-level data is
frequently accessed, MOLAP would provide improved performance.
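The storage mode is ultimately a property of each partition. As an illustrative sketch, the choice appears in the partition definition that the designer generates in Analysis Services Scripting Language (ASSL); the partition name here is hypothetical, and StorageMode can be Molap, Rolap, or Holap:

```xml
<!-- Sketch of an ASSL partition definition (illustrative IDs and names).
     StorageMode controls where aggregations and source data are kept. -->
<Partition>
  <ID>Internet Sales 2011</ID>
  <Name>Internet Sales 2011</Name>
  <StorageMode>Holap</StorageMode>
</Partition>
```

In practice, you rarely edit this XML by hand; the Storage Settings dialog box described below writes these settings for you.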

Note: You should also consider tabular data model storage, particularly as an alternative to
ROLAP and HOLAP. The tabular data model is covered in more detail in Module 10,
Implementing a Tabular Data Model with Microsoft PowerPivot.

Storage Settings
You can use the Storage Settings dialog box to configure storage.
Storage settings for partitions are set in the Storage Options dialog box, which you can open in one of
several different ways.

To configure default storage settings for new measure groups that are added to a cube:

In the Cube Designer, on the Cube Structure tab, in either the Measures or Dimensions pane, click the
cube object, and in the Properties window, click the browse (...) button for the ProactiveCaching
property setting.

To configure default storage settings for new partitions that are added to a measure group, in the
Cube Designer, on the Partitions tab, expand the measure group, and then click the Storage Settings
link for that measure group. This displays the Storage Settings dialog box for the selected measure
group.

To configure storage for an existing partition, in the Cube Designer, on the Partitions tab, expand the
measure group, and then use one of the following methods:

o Right-click the partition, and then click Storage Settings.

o Click the partition, and on the Cube menu, click Storage Settings.

o Click the partition, and on the toolbar, click Set Proactive-cache Settings.

o Click the partition, and in the Properties window, click the ellipsis (...) button for the
ProactiveCaching property setting.

Partitions
A partition is a container for all, or a portion of,
the measure group data.

SSAS Enterprise Edition uses partitions to manage and store data and aggregations for a measure group
in a cube. Splitting the data into multiple partitions provides performance benefits:

If you partition based on date, you only need to process the latest partition, which reduces processing
time.

You can store frequently used data differently from older, aggregated data. This enables you to use
higher levels of aggregation and a faster storage mode for frequently used data to improve query times,
and to reduce processing times and storage requirements for older data.

Partitions enable the source data and aggregate data of a cube to be distributed across multiple hard
drives and among multiple server computers. You can use this to greatly improve performance on large
cubes.

When you incrementally update a partition, a temporary partition is created that has an identical structure
to the source partition. The temporary partition is then merged with the original partition. You should
consider data integrity when you incrementally update individual partitions. For example, if a change in
the source data results in a record moving from one partition to another, but only one partition is
updated, the data may either disappear or be duplicated.

Note: For more information about Remote Partitions, see
http://go.microsoft.com/fwlink/?LinkID=246785.

A simple Partition object consists of:

Basic information that includes the name of the partition, as well as the storage and processing
modes.

A slicing definition, which is an MDX expression that specifies a tuple or a set.

An aggregation design, which is a collection of aggregation definitions that you can share across
multiple partitions.

You can partition a measure group horizontally or vertically:

In a horizontally-partitioned measure group, each partition is based on a separate fact table, which can
come from a different data source.

A vertically-partitioned measure group is based on a single fact table, with each partition bound to a
source system query that filters the data for that partition.
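For example, a vertically-partitioned measure group might bind each partition to a filtered source query such as the following sketch; the fact table and key column follow the AdventureWorksDW-style naming used in this course, but the exact names are illustrative:

```sql
-- Source query for a partition holding only 2011 reseller sales
-- (illustrative table and column names)
SELECT *
FROM FactResellerSales
WHERE OrderDateKey BETWEEN 20110101 AND 20111231;
```

When you define partitions this way, ensure that the filter predicates do not overlap; otherwise, rows are counted more than once and measure values are inflated.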

Lab: Configuring Measures and Measure Groups


Scenario
Business users are analyzing Internet sales with the multidimensional cube you have created, but they also
need to analyze reseller sales. You have reseller sales data in the data warehouse on which the cube is
based, so you plan to add a second measure group containing reseller measures.

Users have also requested that you remove some measures that they do not need when analyzing data,
and ensure that the remaining measures are clearly named.

Users specifically need to analyze reseller sales by product, so you must ensure the cube supports the
required relationship between the new reseller sales measure group and the Product dimension.

Finally, the data center administrator is concerned about the disk space used by the cube, but your users
are worried about performance when analyzing Internet sales. You must optimize the storage and
aggregations of the cube to balance these concerns.

Objectives
After completing this lab, you will be able to:

Configure measures.

Define a regular relationship.

Configure measure group storage.

Estimated Time: 60 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Configuring Measures


Scenario
You are refining the OLAP cube for your company. Users can currently view measures for Internet sales,
but you have additional reseller sales data in your data warehouse that they would like to analyze. When
you add the reseller sales measures, your users want you to remove any measures not required for
business analysis, and ensure that measures in the cube are clearly named.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Create a Measure Group

3. Modify measure groups

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab04\Starter folder as Administrator.



Task 2: Create a Measure Group


1. Start Visual Studio and open the Adventure Works OLAP.sln solution in the D:\Labfiles\Lab04\Starter
folder.

2. In the Adventure Works DSV data source view, add the FactResellerSales table, together with all
related tables.

3. In the Sales cube, add a new measure group named Reseller Sales based on the FactResellerSales
table.

4. Review the names of the measures in the Reseller Sales measure group. Note that, when the
Reseller Sales measure group was set up, measures were created for all the numerical fields in the
FactResellerSales table.

Task 3: Modify measure groups


1. Delete the following measures from the cube:

o Revision Number
o Unit Price

o Extended Amount

o Unit Price Discount Pct


o Discount Amount

o Product Standard Cost

o Tax Amt
o Freight

Tip: Click the Show Measures Grid icon to view all the measures in the cube as a grid. In this view, you
can multi-select measures by holding the Ctrl key.
2. Rename the following measures:

o Total Product Cost (rename to Reseller Cost).

o Sales Amount (rename to Reseller Revenue).

o Fact Reseller Sales Count (rename to Reseller Sales Count).

Results: After this exercise, you should have created a new measure group for the FactResellerSales
table, removed unrequired measures, and renamed measures.

Exercise 2: Defining a Regular Relationship


Scenario
Your users have specifically stated that they must be able to analyze reseller sales by product, so you must
create the required relationships to support aggregating reseller sales measures by the Product
dimension.
The main tasks for this exercise are as follows:

1. View existing dimensions for measure groups

2. Create a dimension

Task 1: View existing dimensions for measure groups


1. Deploy the Adventure Works OLAP solution, entering the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd if prompted.

2. Browse the Sales cube and review the measures and dimensions available by selecting each measure
group.

o When the Internet Sales measure group is selected, the Customer dimension should be listed.

o When the Reseller Sales measure group is selected, there is no Customer dimension, because
reseller sales are sold to resellers, which are defined in a different dimension table.

Task 2: Create a dimension


1. Add a dimension named Reseller based on the DimReseller table to the solution.

o Do not include any related tables.

o Include only the Reseller Key, Business Type, and Reseller Name attributes.

2. On the Dimension Usage tab of the cube designer for the Sales cube, verify that a regular
relationship has been defined between the Reseller Sales measure group and the Reseller
dimension.

3. Deploy the Adventure Works OLAP solution, and then browse the cube, verifying that you can view
reseller revenue by attributes of the Reseller dimension.

Results: After this exercise, you should have added a Reseller dimension that uses a regular relationship
with the Reseller Sales measure group to enable you to analyze reseller sales data.

Exercise 3: Configuring Measure Group Storage


Scenario
Users have requested that the cube be optimized to improve performance when analyzing Internet sales.
However, you must balance this need for optimization against the space required to store aggregations.

The main tasks for this exercise are as follows:


1. Configure Proactive Caching

2. Design Aggregations

Task 1: Configure Proactive Caching


1. Configure proactive caching for the Internet Sales measure group in the Sales cube. Specify that the
Automatic MOLAP setting should be used.

2. Configure proactive caching for the Reseller Sales measure group in the Sales cube. Specify that the
Automatic MOLAP setting should be used.

Task 2: Design Aggregations


1. Use the Aggregation Design Wizard to design aggregations for the Internet Sales measure group in
the Sales cube.

o Set all aggregation usage to default before starting the configuration.

o Use the wizard to count the objects in the measure group.

o Generate aggregations until the performance gain reaches 35%.



o Name the aggregation you have generated InternetSalesAgg.

2. Use the Aggregation Design Wizard to design aggregations for the Reseller Sales measure group,
using the same settings as the Internet Sales measure group.

3. Deploy the Adventure Works OLAP solution.

4. Use SQL Server Management Studio to connect to the localhost instance of Analysis Services and
verify that the aggregation designs were deployed with the Sales cube in the Adventure Works OLAP
database.

Results: After this exercise, you should have defined the storage mode and aggregations for the Internet
Sales and Reseller Sales measure groups.

Module Review and Takeaways


In this module, you have learned how to create and configure measure groups in a cube, and define
relationships between measure groups and dimensions.

Review Question(s)
Question: Give an example of scenarios where you would use an additive measure, a
semiadditive measure, and a nonadditive measure.

Module 5
Introduction to MDX
Contents:
Module Overview 5-1

Lesson 1: MDX Fundamentals 5-2

Lesson 2: Adding Calculations to a Cube 5-6

Lesson 3: Using MDX to Query a Cube 5-13

Lab: Using MDX 5-18

Module Review and Takeaways 5-21

Module Overview
Multidimensional Expressions (MDX) is the query language you use to work with and retrieve
multidimensional data in Microsoft SQL Server 2014 Analysis Services (SSAS). This module describes
the fundamentals of MDX and explains how to build calculations, such as calculated members and named
sets.

Objectives
After completing this module, you will be able to:

Describe MDX.

Add calculations to a cube.

Describe how to use MDX in client applications.



Lesson 1
MDX Fundamentals
You can use MDX to query multidimensional data or to create MDX expressions for use within a cube.
However, you must first understand the fundamentals of MDX. This lesson describes MDX. It explains what
MDX is, discusses basic MDX query syntax, and explains how to specify query and slicer axes. Finally, this
lesson discusses the process involved in establishing cube context.

Lesson Objectives
After completing this lesson, you will be able to:

Describe MDX.

Explain basic MDX query syntax.

Specify query and slices axes.

Establish the cube context.

What Is MDX?
MDX is an industry-standard specification that has been adopted by a wide range of Online Analytical
Processing (OLAP) vendors. MDX was created specifically to query OLAP cubes; equivalent expressions in
Structured Query Language (SQL) would be long and complex, because SQL is designed to navigate
two-dimensional tables. With MDX, it is straightforward to navigate multidimensional cubes, as well as
up, down, and across hierarchies.

MDX consists of two parts: MDX statements and MDX expressions. MDX statements return record sets,
and client applications use them to populate the user interface. MDX expressions return single values and
are used in the cube to create values such as calculated or default members. In most scenarios in SSAS,
you will use MDX expressions, but you might use MDX statements in client applications or to define
subsets of cube data.

Terms that occur frequently when describing MDX statements and expressions include cell, tuple, and set.
A cell is the intersection between a member of the measures dimension and a member of one or more
other dimensions from which you can obtain data. To identify or extract such data, whether it is a single
cell or a block of cells, MDX uses a system called tuples. A set is a collection of tuples that defines multiple
cells from the same dimensions.

MDX has many uses when you work with multidimensional cubes. In the last illustration on the slide, you
can see an example where MDX is used to limit the cells that can be read by members of a role. Later in
this module, you will also see MDX used in SQL Server Management Studio, Microsoft Excel, and SQL
Server Reporting Services.

MDX is not limited to multidimensional cubes. You can also use MDX with tabular data models alongside
the new Data Analysis Expressions (DAX) language, which is designed specifically for tabular data model
use.

Basic MDX Query Syntax


The basic MDX query is the SELECT statement. This
is the most frequently used query in MDX and you
use it to retrieve data from a cube.

In MDX, the SELECT statement specifies a result set that contains a subset of multidimensional data that
has been returned from a cube. To identify the query axes, the cube that sets the context of the query,
and the slicer axis, the MDX SELECT statement uses the following clauses:
A SELECT clause. This determines the query
axes of the displayed result set by using an
MDX SELECT statement.
A FROM clause. This determines which multidimensional data source to use when extracting data to
populate the result set of the MDX SELECT statement.

A WHERE clause. This optionally determines which dimension or member to use as the slicer axis,
restricting the extraction of data to a specific dimension or member.

Bracketing Object Names


So that MDX can interpret names, you need to put square brackets around the name of any object from
an OLAP database (a cube, dimension, hierarchy, level, or member) in the following three cases:

If the name contains a space or other special character. For example, you must refer to the member
Gross Profit in MDX as [Gross Profit] because the name contains a space.

If the name is the same as a keyword, such as an MDX function. For example, SELECT is an MDX
keyword, so if you have a level named SELECT, you must refer to the level as [Select].

If the name begins with a numeric character. For example, you must refer to the member 2011 as
[2011].

If a name does not meet any of these conditions, you can still put brackets around the name, but they are
not required.

Qualifying Names to Avoid Ambiguity


The same object name from an OLAP database can appear in more than one place in an OLAP cube, even
in the same cube dimension or level. To avoid ambiguity, MDX enables you to add qualifiers to a member
name. The qualifiers ensure that, when you read the MDX statement or expression, you know to which
object the name refers. The following are examples of ambiguous situations and how they would be
qualified:

In a Time dimension, the member Q1 might appear as a child of multiple years. Q1 for each year
would be qualified as follows: [2010].[Q1], [2011].[Q1], and so on.

A cube could have West as a member name in the Region dimension and an employee with the
name West in the Sales Rep dimension. The two members would be qualified as follows:
[Region].[West] and [Sales Rep].[West].

Ensuring Correct Member is Returned


MDX does not return an error if a name is ambiguous, that is, if the name appears in more than one place
in a cube. MDX follows the simple rule of using the first occurrence of the name in the database. For
example:

If you use [Q1] as a name in a cube that includes the years 2010 and 2011, MDX interprets the name
as [2010].[Q1] because 2010 is the first occurrence in the Time dimension.

If you use [West] in a cube that includes that member name in both the Region and the Sales Rep
dimensions, MDX interprets the name according to which of the two dimensions appears first in the
Cube Editor.

Note: The slide uses SELECT, FROM, and WHERE clauses, has bracketed object names, and
qualifies names to avoid ambiguity.
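Putting these rules together, a minimal sketch of a SELECT statement might look like the following; the cube, measure, and member names reuse the Adventure Works examples from this lesson and are illustrative:

```mdx
SELECT
    [Measures].[Sales Amount] ON COLUMNS,
    [Product Category].[Product Category Key].Members ON ROWS
FROM [Adventure Works]
WHERE ([Date].[Calendar Year].&[2011])
```

Note that every object name is bracketed (several contain spaces), member names are qualified by their dimension and hierarchy, and the &[2011] reference uses a key rather than a name.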

Specifying Query and Slicer Axes


Query axes and slicer axes represent a set of
hierarchies from which data is retrieved, and have
the following characteristics:

Query axes specify the edges of a cell set that an MDX SELECT statement returns.

To explicitly specify a query axis, use the <SELECT query axis clause>, as the code examples illustrate.

The slicer axis filters the data that the MDX SELECT statement returns, restricting the returned data so
that only data that intersects with the specified members is returned.

To explicitly specify a slicer axis, you use the <SELECT slicer axis clause> in MDX.

If a member from a hierarchy within the cube is not explicitly included in a query axis, the default
member from that hierarchy is implicitly included in the slicer axis.

You can specify axes by using either a name or an ordinal integer identifier. For example, the query in the
following code example uses names to place the Sales Amount measure on the column axis and the
non-empty product category members on the row axis:

SELECT
[Measures].[Sales Amount] On COLUMNS,
NonEmpty([Product Category].[Product Category Key].Members, [Measures].[Sales Amount]) On ROWS
FROM [Adventure Works]

The following code example performs the same query by using ordinal integer identifiers:

SELECT
[Measures].[Sales Amount] On 0,
NonEmpty([Product Category].[Product Category Key].Members, [Measures].[Sales Amount]) On 1
FROM [Adventure Works]

You should define a default member before you work with slicer axes. The default member of an attribute
hierarchy is used to evaluate expressions when an attribute hierarchy is not included in a query. To avoid
name resolution problems, define the default member in the cube's MDX script in the following situations:

If the cube refers to a database dimension more than once.

If the dimension in the cube has a different name from the dimension in the database.

If you want to have different default members in different cubes.



You should be aware that an MDX query cannot skip query axes. That is, a query that includes one or
more query axes must not exclude lower-numbered or intermediate axes. For example, a query cannot
have a ROWS axis without a COLUMNS axis or have COLUMNS and PAGES axes without a ROWS axis.
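As a sketch of the slicer axis described above, the following query restricts the results to a single product category by placing its member in the WHERE clause; the Date hierarchy and the member key are illustrative:

```mdx
SELECT
    [Measures].[Sales Amount] ON COLUMNS,
    NonEmpty([Date].[Calendar Year].Members, [Measures].[Sales Amount]) ON ROWS
FROM [Adventure Works]
WHERE ([Product Category].[Product Category Key].&[1])
```

The slicer member does not appear in the result grid; it only filters which cells contribute to the values on the query axes.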

Establishing Cube Context


Every MDX query runs within a specified cube
context. This context defines the members that are
evaluated by the expressions within the query.

In the SELECT statement, the FROM clause determines the cube context. Although the FROM clause
specifies the cube context as within a single cube, you can still work with data from more than one cube
at a time.

You can specify a complete cube or a subcube. First, you create the subcube; any query against the cube
in that session is then limited to the subcube. The following MDX code example limits the Budget cube
to only two accounts for the session:

CREATE SUBCUBE Budget AS
SELECT {[Account].[Account].&[4200], [Account].[Account].&[4300]} ON 0
FROM Budget

Note: The subcube has the same name as the cube.

Note: The accounts 4200 and 4300 are preceded by an ampersand, signifying that the value
is a key rather than a name.
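The two reference styles can be contrasted as follows; the key references match the subcube example above, while the member name Cash is hypothetical:

```mdx
[Account].[Account].&[4200]   -- by key: the ampersand marks 4200 as a key value
[Account].[Account].[Cash]    -- by name: a hypothetical member name
```

Key-based references are generally safer in saved queries and scripts, because member names can change while keys typically remain stable.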

You can also use the LOOKUPCUBE and FILTER functions to refine the cube context. The LOOKUPCUBE
function enables you to retrieve data from outside the cube. For example, the MDX statement in the
following code example retrieves data from the AdventureWorksArchive cube even though the current
context is AdventureWorks:

WITH MEMBER MEASURES.LOOKUPCUBEARCHIVE AS
LOOKUPCUBE("AdventureWorksArchive", "[Measures].[Internet Sales Amount]")
SELECT MEASURES.LOOKUPCUBEARCHIVE ON 0
FROM [Adventure Works]

Note: The LOOKUPCUBE function is likely to cause poor performance and, if you use it
often, you should consider a cube redesign.

The FILTER function filters the results based on a search condition. The following code example filters the
results to display only records with a Sales Amount greater than 1000:

SELECT [Measures].[Sales Amount] ON 0,
FILTER([Date].[Date].[Date].MEMBERS, [Measures].[Sales Amount]>1000) ON 1
FROM [Adventure Works]

Lesson 2
Adding Calculations to a Cube
In SSAS, calculations include calculated members and named sets that are combinations of cube data,
arithmetic operators, numbers, and functions. In MDX, a calculated member is a member that is resolved
by calculating an MDX expression to return a value. A named set is a set of dimension members or
expressions that are grouped together for reuse in MDX queries. The ability to construct and use
calculated members and named sets in an MDX query provides a great deal of manipulation capability for
multidimensional data.

Lesson Objectives
After completing this lesson, you will be able to:

Describe how to add calculations to a cube.

Explain the concepts of calculated members.


Describe named sets.

List useful non-family MDX functions.

List useful family MDX functions.

Define subcubes by using the SCOPE statement.

The Calculations Tab of the Cube Designer


To create a calculated member, use the New
Calculated Member button on the toolbar of the
Calculations tab in Cube Designer. This command
displays a form to specify several options for the
calculated member:

Use the Script Organizer pane in form view to display the contents of the cube script in an ordered
format.

Use the Calculation Tools pane in both form and script view to display metadata, functions, and tools
available to the cube.

Use the Script Editor pane in script view to edit the entire cube script and, in form view, edit script
commands.

Use the Calculated Member Form Editor pane in form view to edit calculated members in the cube
script.

Use the Named Set Form Editor pane in form view to edit named sets in the cube script.

The Calculated Member Builder is the principal interface for creating calculated members by using
MDX. The interface includes several regions:

The top region contains three boxes that enable you to change the name, dimension, and location for
the new member:

o The default dimension for a calculated member is the measures dimension.



o You have the option of selecting any of the cube dimensions in the Parent dimension list. The
parent member does not determine the aggregation of the calculated member, because
calculated members do not aggregate. The parent member merely dictates the placement of the
calculated member in the dimension hierarchy.

The second region contains the Value expression box, where you enter an MDX expression. You type
the expression in the Value expression box, double-click the dimension, or drag function values to
the box.

The third region is the metadata area. On the left is a Data list that includes all the cube dimensions.
A Functions list in the center displays all the MDX functions. On the right are buttons to help you
construct an MDX expression.

The fourth region displays information about the currently selected item in the Data and Functions
lists.

The bottom region contains standard OK, Cancel, and Help buttons.

Calculations are solved in the order listed in the Script Organizer pane. You can reorder calculations by
right-clicking any one and then clicking Move Up or Move Down on the shortcut menu. You can also
click a calculation and then click Move Up or Move Down on the toolbar.

For best performance with cell calculations, specify only a single member when possible. The Non-empty
behavior option stores the names of measures that are used to resolve NON EMPTY queries in MDX. If
this property is blank, the calculated member must be evaluated repeatedly to determine whether a
member is empty. If this property contains the name of one or more measures, the calculated member is
treated as empty if all specified measures are empty. This property is an optimization hint to SSAS to
return only records that are not NULL. Returning only records that are not NULL improves the
performance of MDX queries that use the NON EMPTY operator, the NonEmpty function, or require the
calculation of cell values.

Calculated Members
A calculated member is a member whose value is
calculated at run time by using an MDX expression
that you specify when you define the calculated
member. A calculated member is available to
Business Intelligence (BI) applications just like any
other member.

Calculated members do not increase the size of the cube, because only the definitions are stored in the
cube; values are calculated in memory to answer a query as required.

Although calculated members are typically based on data that already exists in the cube, you can create
complex expressions by combining data with arithmetic operators, numbers, and functions.

You can define a calculated member with one of two contexts:

Query-scoped. A calculated member that is defined as a part of an MDX query and whose scope is
limited to that query.

Session-scoped. A calculated member whose scope is wider than the context of the query and whose
scope is the lifetime of the MDX session.

There are two ways to create calculated members:

For a single MDX query, you can define the calculated member by using the WITH keyword.

For a calculated member that is available throughout an MDX session, use the CREATE MEMBER
statement.

You should be aware that, because calculated members perform calculations on cube data, they are
executed at run time. Therefore, extensive use of calculated members can cause query execution times to
increase and query performance to decrease.

The following code example creates two calculated members. The first is named Total Revenue, and adds
the Internet Revenue measure to the Reseller Revenue measure. The second is named Reseller GPM,
and calculates the gross profit margin for reseller sales from the Reseller Sales measure and the Reseller
Cost measure:

Using MDX to Define Calculated Members


CREATE MEMBER CURRENTCUBE.[Measures].[Total Revenue]
AS [Measures].[Internet Revenue]
+[Measures].[Reseller Revenue],
FORMAT_STRING = "Currency",
NON_EMPTY_BEHAVIOR = { [Internet Revenue],
[Reseller Revenue] },
VISIBLE = 1 ;
CREATE MEMBER CURRENTCUBE.[Measures].[Reseller GPM]
AS ([Measures].[Reseller Revenue]
-[Measures].[Reseller Cost])
/[Measures].[Reseller Revenue],
FORMAT_STRING = "Percent",
NON_EMPTY_BEHAVIOR = { [Reseller Revenue] },
VISIBLE = 1 ;

Named Sets
A named set is an MDX expression that returns a
set of dimension members. You can define named
sets and save them as part of the cube definition;
you can also create named sets in client
applications.
Named sets have a number of useful features:

You can use named sets in MDX queries in client applications, and you can also use them to define sets in subcubes.

Named sets simplify MDX queries and provide useful aliases for complex, commonly used set expressions.

There are two ways to define named sets:

To create a named set for a single MDX query, you can define that named set by using the WITH
keyword.

To create a named set that is available throughout an MDX session, use the CREATE SET statement. A
named set that is created by using the CREATE SET statement will not be removed until after the MDX
session closes.

When you use the CREATE SET statement, you can specify that the set should be static or dynamic. Static
sets are evaluated only once, when the CREATE SET statement is executed. Dynamic sets are evaluated
each time they are used in a query.
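For comparison, a query-scoped named set can be sketched as follows, using the WITH keyword (hierarchy and measure names are assumed to match the example cube used in this module):

```mdx
-- Query-scoped named set, defined with the WITH keyword.
WITH SET [Bike Products] AS
    { [Product Category].[English Product Category Name].&[Bikes] }
SELECT [Measures].[Reseller Revenue] ON COLUMNS,
       [Bike Products] ON ROWS
FROM [Sales];
```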

The following code example defines a dynamic named set that includes all products in the Bikes
category:

Using MDX to Create a Named Set


CREATE DYNAMIC SET CURRENTCUBE.[Core Products]
AS [Product Category].[English Product Category Name].&[Bikes] ;

Demonstration: Defining Cube Calculations


In this demonstration, you will see how to:

Create a Calculated Member.


Create a Named Set.

Browse Custom Calculations.

Demonstration Steps
Create a Calculated Member

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod05 folder, run Setup.cmd as Administrator.

3. Start Visual Studio and open Demo.sln in the D:\Demofiles\Mod05 folder.

4. On the Build menu, click Deploy Solution. If you are prompted for impersonation credentials, enter
the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.

5. In Solution Explorer, double-click SalesDemo.cube to open it in the cube designer. Then click the
Calculations tab.

6. On the Cube menu, click New Calculated Member.

7. Create a calculated member with the following properties:

o Name: Profit
o Parent hierarchy: Measures

o Expression: [Measures].[Revenue]-[Measures].[Cost]

o Format string: "Currency"

o Visible: True

o Non-empty behavior: Revenue

o Associated measure group: Reseller Sales

8. On the Cube menu, point to Show Calculations in, and click Script to view the MDX script for the
cube, and note the CREATE MEMBER statement that has been generated from the properties in the
form.
9. On the File menu, click Save All.

Create a Named Set

1. On the Cube menu, point to Show Calculations in, and click Form to return to the form view.

2. On the Cube menu, click New Named Set.

3. In the Name box, change the name of the new named set to [Specialty Stores].

4. In the Calculation Tools pane, on the Metadata tab, expand Reseller, expand Business Type,
expand Members, and then expand All.

5. Drag Specialty Bike Shop into the Expression box.

6. In the Type list, select Static.

7. On the Cube menu, point to Show Calculations in, and click Script to view the MDX script for the
cube, and note the CREATE SET statement that has been generated from the properties in the form.

8. On the File menu, click Save All.

Browse Custom Calculations

1. On the Build menu, click Deploy Solution. If you are prompted for impersonation credentials, enter
the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.

2. In Solution Explorer, right-click SalesDemo.cube and click Browse.

3. In the Metadata pane, expand Measures, expand Reseller Sales, and drag Profit to the query results
area. The total profit for all sales is shown.
4. In the Metadata pane, expand Reseller, and drag Specialty Stores to the Hierarchy column in the
filters area. The results are filtered to show profit for the resellers in the Specialty Stores named set.

5. Close Visual Studio.

Useful MDX Functions (Non-Family Functions)

CURRENTMEMBER
The CURRENTMEMBER function returns the
current member of a dimension. You typically use
this with another function to return a property of
the current member. For example, the MDX
expression in the following code example returns
the name of the current member in the Product
dimension:

Product.CurrentMember.Name

NAME
As you can see in the preceding code example, the NAME function returns the name of the current
member, but it has other uses. The NAME function can return the name of a dimension, level, or member.
The syntax is exactly the same and depends entirely upon the object referenced before the NAME
function.

DIMENSION, HIERARCHY, and LEVEL


The DIMENSION, HIERARCHY, and LEVEL functions return the current dimension, hierarchy, or level. This
is often followed by the NAME function to return the name of the object.

PARALLELPERIOD
The PARALLELPERIOD function returns a member from a previous period in the same relative position.
Therefore, if you use the parallel period of March 2011 at the Year level and define a lag of 2 to go back
two years, it would return March 2009.
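That lookup can be sketched as the following expression (the date hierarchy and member names here are hypothetical):

```mdx
-- Returns the member two years before March 2011 at the same relative
-- position, that is, March 2009. Hierarchy and member names are hypothetical.
ParallelPeriod(
    [Date].[Calendar].[Calendar Year],      -- level at which to lag
    2,                                      -- lag of 2 periods (years)
    [Date].[Calendar].[Month].[March 2011]  -- reference member
)
```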

NONEMPTY
NONEMPTY is a useful function because it returns the rows that are not empty from a set, as in the
following code example:

NonEmpty([Product Category].[Product Category Key].Members,[Measures].[Sales Amount])

Note: For more information about MDX functions, go to Microsoft SQL Server 2008
MDX Step by Step by Bryan C. Smith and others (ISBN-10: 0735626189) or the Multidimensional
Expressions (MDX) Reference, at http://go.microsoft.com/fwlink/?LinkID=246786.

Useful MDX Functions (Family Functions)


Relationships between different members in the same dimension of an OLAP database can be described
by using terms that are analogous to a family tree. Understanding these relationships is important, and
MDX contains a wide range of functions that navigate the relationships between the members within a
dimension.
Examples include:

The PARENT function returns the member from one level above the current member, as in the following code example:

Product.CurrentMember.Parent.Name

In the previous example, the PARENT function of the member San Francisco would return CA.

The ANCESTOR function returns the member from a specified level or number of levels above the
current member. For example, you could use this to return the parent, and this code example would
return the same value as the previous example:

Ancestor(Product.CurrentMember,1).Name

The CHILDREN function returns all of the members in the level below the current member. The
FIRSTCHILD and LASTCHILD functions return the first or last members respectively from the level
below the current member. For example, the first child of CA in the example would be Los Angeles,
and the last child of USA would be WA.

The FIRSTSIBLING and LASTSIBLING functions return the first or last members respectively from the
level of the current member. For example, the first sibling of San Diego in the example would be Los
Angeles and the last sibling of Los Angeles would be San Francisco.
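The geography example above can be sketched as the following expressions (the [Customer].[Geography] hierarchy and its member names are hypothetical, chosen to match the CA and San Francisco example):

```mdx
-- Hypothetical [Customer].[Geography] hierarchy: Country > State > City
[Customer].[Geography].[City].[San Francisco].Parent.Name       -- CA
[Customer].[Geography].[State].[CA].FirstChild.Name             -- Los Angeles
[Customer].[Geography].[Country].[USA].LastChild.Name           -- WA
[Customer].[Geography].[City].[San Diego].FirstSibling.Name     -- Los Angeles
[Customer].[Geography].[City].[Los Angeles].LastSibling.Name    -- San Francisco
```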

Note: For more information about MDX functions, go to Microsoft SQL Server 2008
MDX Step by Step by Bryan C. Smith and others (ISBN-10: 0735626189) or the Multidimensional
Expressions (MDX) Reference, at http://go.microsoft.com/fwlink/?LinkID=246786.

Scoped Assignments
The SCOPE statement limits the scope of specified
MDX statements to a specified subcube.

A scoped assignment uses the SCOPE command to define a subcube where you can then apply MDX statements, including the CALCULATE statement. You can use the THIS function to refer to the subcube.

SCOPE statements will create subcubes that expose "holes" in ragged dimensions regardless of the MDX Compatibility setting. For example, the statement SCOPE(Customer.State.members) can include the states in countries or regions that do not contain states, but for which otherwise invisible placeholder members were inserted.
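A minimal sketch of a scoped assignment in a cube's MDX script might look like this (the measure and member names are hypothetical):

```mdx
-- Scoped assignment sketch; measure and member names are hypothetical.
SCOPE ( [Order Date].[Calendar Date].[Calendar Year].&[2014],
        [Measures].[Revenue Quota] );
    -- THIS refers to the subcube defined by the enclosing SCOPE
    THIS = [Measures].[Revenue] * 1.1;
END SCOPE;
```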

Lesson 3
Using MDX to Query a Cube
So far this module has looked at MDX in SSAS, but many MDX statements are used in client applications
to execute queries against the multidimensional cube. This lesson describes how to use MDX to query a
cube, and how to troubleshoot query performance.

Lesson Objectives
After completing this lesson, you will be able to:

Use MDX to query a cube.

Describe how Analysis Services processes MDX queries.


Troubleshoot query performance.

Using MDX in Client Applications


Many client applications and BI tools submit MDX
queries to cubes in Analysis Services databases. In
most cases, specialist BI tools generate the MDX
for the user but, if you are involved in developing
custom reports or applications, you may need to
write some MDX queries.
Most MDX queries submitted by client
applications consist of a SELECT statement that
retrieves a specific set of cells from the cube based
on axes and slicers defined in the query.

The following code example returns the aggregated Reseller Revenue measure for each year and product category:

Querying a Cube with MDX


SELECT
NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,
[Product].[Product Category Name].[All].CHILDREN ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Revenue]

SQL Server Management Studio provides an easy-to-use environment for testing MDX queries. To query
Analysis Services cubes in SQL Server Management Studio, connect to the SSAS instance and create a new
MDX query. SQL Server Management Studio is intended purely for testing purposes, not as a user
interface. Users are likely to consume the results of MDX queries as Reporting Services reports or Excel
spreadsheets.

How Analysis Services Processes Queries


Analysis Services uses the following subsystems to
process queries:

Session Manager. This manages user sessions, and integrates with Security Manager to authenticate users.

Query Processor. This is responsible for generating query execution plans and applying the necessary formulas and calculations to create the requested result set. Query execution plans are cached for faster execution in subsequent queries.

Storage Engine. This retrieves the data required to satisfy the query. Whenever possible, data is
cached for faster retrieval in subsequent queries.

Query execution events


During query execution, a sequence of events occurs as the process progresses through the subsystems.
The specific events that occur can vary between queries, but the following general pattern is typical of an
Analysis Services query execution:

1. A new session is initialized in the session manager.

2. A query is started.

3. The query starts the process of extracting data from a cube.

4. The query is broken down into subcube queries, which are used to retrieve data from the storage
engine.

5. If the data model includes aggregations that satisfy the subcube query, the aggregations are
retrieved. Otherwise, more granular data must be retrieved and aggregated.
6. If the required data is cached, the subcube queries obtain data from the cache. Otherwise, the data is
retrieved from the stored dimension or measure group.

7. At the end of each subcube query, results are passed to the query processor that then begins to
serialize the results, applying any additional aggregations, sort operations, or other calculations as
required.

8. When the results are serialized, the cube query ends.

9. When the query ends, the results are passed to the user session.

Troubleshooting Query Performance


You can use SQL Server Profiler to trace events
that occur during query execution and determine
which steps in the overall query process take the
most time to complete. You can then use this
information to identify steps in the process that
will benefit most from further investigation and
optimization. Each data model and query has its
own characteristics, but you can use the following
guidelines to start troubleshooting query
performance issues:

If the process spends more time in the query processor than the storage engine, consider optimizing the MDX or DAX query to reduce the number of calculations being performed.

If the query process spends more time in the storage engine than the query processor, consider
creating partitions in the data model, and defining attribute relationships in multidimensional
hierarchies.

If the query process spends more time in the storage engine than the query processor, and data is
seldom retrieved from aggregations, consider optimizing the aggregations in the cube based on
usage.

If the query process spends more time in the storage engine than the query processor, but data is
rarely retrieved from cache, investigate the memory resources, utilization, and configuration.

Additional Reading: For more information about diagnosing query performance problems, go to Analysis Services Performance Guide at http://www.microsoft.com/en-us/download/confirmation.aspx?id=17303.

Demonstration: Monitoring MDX Query Execution


In this demonstration, you will see how to monitor and troubleshoot MDX query performance.

Demonstration Steps
Monitor Analysis Services Query Processing

1. Ensure that you have completed the previous demonstration in this module.

2. Start SQL Server Management Studio, and connect to the MIA-SQL Analysis Services instance.

3. In Object Explorer, right-click MIA-SQL and click Restart. When prompted to allow the program to
make changes, click Yes, and when prompted to confirm the restart action, click Yes. Wait for
Analysis Services to restart.

4. Start SQL Server Profiler, and on the File menu, click New Trace.

5. When prompted, use Windows authentication to connect to the MIA-SQL instance of Analysis
Services.

6. In the Trace Properties dialog box, in the Trace name box, type Analysis Services Query Trace.

7. On the Events Selection tab, select Show all events, and then clear the Events check box in all rows
other than the following:

o Progress Report Begin

o Progress Report End

o Query Begin
o Query End

o Query Cube Begin

o Query Cube End

o Query Subcube

o Serialize Results Begin

o Serialize Results End


8. Clear the Show all events check box, and then select Show all columns. Clear the selected check
boxes in all columns other than the following (to clear all check boxes in a column, right-click the
column header and click Deselect column):
o EventSubclass

o TextData

o ApplicationName
o Duration

o DatabaseName

o ObjectName

o SPID

o CPUTime

9. Clear the Show all columns check box.

10. Click Column Filters, and in the Edit Filter dialog box, select DatabaseName, expand Like, type
Demo, and then click OK.

11. Click Run, and then minimize SQL Server Profiler.


12. Start the Performance Monitor administrative tool, and in the pane on the left, if necessary, expand
Performance and Monitoring Tools, and then click Performance Monitor.

13. In the pane at the bottom, select each counter in turn, and on the toolbar, click Delete (the red X
icon) until there are no counters displayed.

14. On the toolbar, click Add (the green + icon).

15. In the Add Counters dialog box, in the list of objects, expand the MSAS12: MDX object, click Total
cells calculated, click Add, and then click OK.

16. On the toolbar, in the Change graph type drop-down list, click Report. Total cells calculated
should currently have the value 0.000.

17. Minimize Performance Monitor.



Troubleshoot an MDX Query

1. In SQL Server Management Studio, open MDX Query.mdx in the D:\Demofiles\Mod05 folder. This
query ranks cities by reseller sales revenue.

2. In the Available Databases drop-down list, select Demo.

3. Click Execute, and wait for the query to return results.


4. Maximize SQL Server Profiler, and on the File menu, click Stop Trace. Maximize Performance
Monitor, and on the toolbar, click Freeze Display.

5. In SQL Server Profiler, view the trace, and note the Duration value for the last Query Subcube event
(which represents the time spent retrieving the cube from the storage engine), and the Duration
value for the last Serialize Results End event (which represents the time spent manipulating the data
after it was retrieved from storage).
6. In Performance Monitor, note the Total cells calculated value. The results indicate that the query
spent significantly more time manipulating the data than retrieving it from the storage engine, and a
very large number of cells were calculated during the execution of the query. The most appropriate
way to improve the query performance is to optimize the MDX and reduce the number of
calculations being performed.

7. In SQL Server Management Studio, in Object Explorer, right-click the MIA-SQL Analysis Services
instance, and then click Restart. When prompted to allow the program to make changes, click Yes,
and when prompted to confirm the restart action, click Yes. Wait for Analysis Services to restart.

8. Minimize SQL Server Management Studio.

9. In SQL Server Profiler, on the File menu, click Run Trace. Minimize SQL Server Profiler.

10. In Performance Monitor, on the toolbar, click Unfreeze Display. If the Total cells calculated value
does not revert to 0.000, right-click the report, and then click Clear. Minimize Performance Monitor.

11. Maximize SQL Server Management Studio, and open Revised MDX Query.mdx in the
D:\Demofiles\Mod05 folder.

12. In the Available Databases drop-down list, select Demo.


13. Click Execute, and wait for the query to return results.

14. Maximize SQL Server Profiler, and on the File menu, click Stop Trace. Maximize Performance
Monitor, and on the toolbar, click Freeze Display.

15. In SQL Server Profiler, view the trace and note the Duration value for the last Query Subcube event
(which represents the time spent retrieving the cube from the storage engine) and the Duration
value for the last Serialize Results End event (which represents the time spent in the formula
engine).

16. In Performance Monitor, note the Total cells calculated value. The revised version of the query uses
a WITH SET statement to sort the resellers by revenue into a named set before applying the RANK
function. This enables the query processor to use a linear hash scan to find each city's position in the
ordered list, dramatically reducing the number of calculations required to produce the results.

17. Close SQL Server Management Studio, SQL Server Profiler, and Performance Monitor.

Lab: Using MDX


Scenario
You are a BI developer for the Adventure Works Cycles company, and you are creating a custom analysis
application for the Chief Finance Officer (CFO). The application must return sales data for products,
customers, and dates from the Adventure Works Cube, so you have initially decided to experiment with
MDX syntax to query the cube and retrieve business data. The CFO has also asked you to extend the cube
to include measures for profit and gross profit margin. You must create calculated members for these and
ensure you can include them in queries from the custom application you plan to build.

Objectives
After completing this lab, you will be able to:

Create calculated members.

Use MDX to query a cube.

Estimated Time: 30 minutes


Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Creating Calculated Members


Scenario
The CFO has also asked you to extend the cube to include measures for Internet Profit and Reseller Profit.
You must create calculated members for these and ensure that you can include them in queries from the
custom application you plan to build.
The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Use Form View to Create a Calculated Member


3. Use Script View to Create a Calculated Member

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab05\Starter folder as Administrator.

Task 2: Use Form View to Create a Calculated Member


1. Use Visual Studio to open and deploy the Adventure Works OLAP.sln solution in the
D:\Labfiles\Lab05\Starter folder.

2. Add a calculated member named Internet Profit to the Sales cube:

o Use the following expression:

[Measures].[Internet Revenue]- [Measures].[Internet Cost]

o Format the calculated member as currency.

o Select the Internet Cost and Internet Revenue measures in the Non-Empty behavior list.

o Associate the calculated member with the Internet Sales measure group.

3. Save the solution when you have created the calculated member.

Task 3: Use Script View to Create a Calculated Member


1. Use the following MDX script to create a calculated measure named Reseller Profit:

CREATE MEMBER CURRENTCUBE.[Measures].[Reseller Profit]


AS [Measures].[Reseller Revenue]-[Measures].[Reseller Cost],
FORMAT_STRING = "Currency",
NON_EMPTY_BEHAVIOR = { [Reseller Cost], [Reseller Revenue] },
VISIBLE = 1,
ASSOCIATED_MEASURE_GROUP = 'Reseller Sales' ;

2. Save and deploy the solution when you have created the calculated member. Then close Visual
Studio.

Results: After this exercise, you should have created two calculated members.

Exercise 2: Querying a Cube by Using MDX


Scenario
You have created calculated members in the cube, and now you plan to use MDX queries to test them.

The main tasks for this exercise are as follows:

1. Write simple MDX queries

2. Write an MDX Query to Return Data on Rows and Columns

Task 1: Write simple MDX queries


1. Start SQL Server Management Studio and connect to the MIA-SQL instance of Analysis Services.

2. Execute the following MDX query to view Internet profit for each calendar year:

SELECT [Measures].[Internet Profit] ON 0,


NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON 1
FROM [Sales];

3. Modify the query to match the following code, and execute it to view both Internet profit and reseller
profit for each calendar year:

SELECT { [Measures].[Internet Profit], [Measures].[Reseller Profit] } ON 0,


NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON 1
FROM [Sales];

Task 2: Write an MDX Query to Return Data on Rows and Columns


1. Use the following MDX query to return the reseller profit with calendar years on columns and sales
territory groups on rows:

SELECT NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,


NONEMPTY([Sales Territory].[Sales Territory].[Sales Territory Group].MEMBERS) ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Profit];

2. Modify the query to return reseller profit by calendar year and product category, as shown in the
following code:

SELECT NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,


NONEMPTY([Product].[Categorized Products].[Category].MEMBERS) ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Profit];

3. When you are finished, close SQL Server Management Studio without saving any items.

Results: After this exercise, you should have written MDX queries to return data from the Sales cube.

Module Review and Takeaways


In this module, you have learned how to use MDX to create calculated members and named sets; and
how to write MDX queries that return data from a cube.

Review Question(s)
Question: In what scenarios might you need to create your own MDX queries?

Module 6
Enhancing a Cube
Contents:
Module Overview 6-1

Lesson 1: Working with Key Performance Indicators 6-2

Lesson 2: Working with Actions 6-7

Lesson 3: Working with Perspectives 6-11

Lesson 4: Working with Translations 6-13

Lab: Customizing a Cube 6-15

Module Review and Takeaways 6-19

Module Overview
In this module, you will learn how to customize cube functionality by using several technologies available
to you in Microsoft SQL Server 2014 Analysis Services. Cubes function perfectly well without
customization, but customization provides enhanced user interfaces, automation, and translation.

You will learn about Key Performance Indicators (KPIs) and how they can be used to obtain a quick and
accurate historical summary of business success. You will find out about actions and how they allow end
users to go beyond traditional analysis to initiate solutions to discovered problems and deficiencies. This
module also covers perspectives and how they allow users to see the cube more simply and focus on the
most relevant data. Finally, you will learn about translations, and how they can be used to translate various
elements of a cube to enable global users to view and understand cube and dimension data.

Objectives
After completing this module, you will be able to:

Implement Key Performance Indicators.

Implement Actions.

Implement Perspectives.

Implement Translations.

Lesson 1
Working with Key Performance Indicators
In business terminology, a Key Performance Indicator (KPI) is a quantifiable measurement for gauging
business success, frequently evaluated over time. For example, an organization's sales department may
use monthly gross profit as a KPI, but the human resources department may use quarterly employee
turnover. Each is an example of a KPI. Executives frequently consume KPIs grouped together in a business
scorecard to obtain a quick and accurate historical summary of business success.

Lesson Objectives
After completing this lesson, you will be able to:

Describe KPIs.

List the elements of a KPI.

Browse a KPI.

Introducing KPIs
In Microsoft SQL Server Analysis Services, a KPI
is a collection of calculations, which are associated
with a measure group in a cube, and is used to
evaluate business success. The KPI measures values
in the cube against a goal and can also help to
assess values over time to provide trend
information.

To provide at-a-glance company performance to business users, there is an MDX expression that returns a status between -1 (worst) and +1 (best). Presentation of this data is ultimately up to the client application, but status and trend indicators can be set to suggest the graphical output. For example, if you have created a company performance
dashboard to display KPI data on a SharePoint site, a status of -1.0 to -0.34 might display a red traffic
light, a status of -0.33 to +0.33 an amber traffic light and a status of +0.34 to +1.0 a green traffic light.

You must ensure that business rules are followed to assign the status value because an unsuitable MDX
formula could result in a department, or the company as a whole, being incorrectly assigned a successful,
or unsuccessful, status.

One key advantage of KPIs in Analysis Services is that they are server-based and consumable by different
client applications. If a KPI is client-based it must be recreated for each client application and, potentially,
the results could differ if the implementations are not identical.

Elements of a KPI
A simple KPI object is composed of a name, a
description, the goal, the actual value achieved, a
status value, a trend value, a status indicator, a
trend indicator, and a folder where the KPI is
viewed.

Value
A value expression is a physical measure such as
Sales, a calculated measure such as Profit, or a
calculation that is defined within the KPI by using
a Multidimensional Expressions (MDX) expression.

Goal
A goal expression is a value, or an MDX expression that resolves to a value, defining the target for the
measure that the value expression describes. For example, the following goal expression for a Profit
measure has a value of 0.6 (or 60%) for Accessories and 0.3 (30%) for all other categories:

Case
When [Product].[Category].CurrentMember Is
[Product].[Category].[Accessories]
Then .60
Else .30
End

Status
A status expression is an MDX expression that Analysis Services uses to evaluate the current status of the
value expression compared to the goal expression, with a result in the range of -1 to +1. The worst result
is -1, and the best +1. The status expression displays with a graphic to help you easily determine the
status of the value expression compared to the goal expression. For example, the following code returns a
value of 1 if you achieve the goal, 0 if you are at least halfway to achieving it, and -1 if you are less than
halfway:

Case
When KpiValue( "Gross Profit Margin" ) /
KpiGoal ( "Gross Profit Margin" ) >= 1
Then 1
When KpiValue( "Gross Profit Margin" ) /
KpiGoal ( "Gross Profit Margin" ) < 0.5
Then -1
Else 0
End

Trend
A trend expression is an MDX expression that Analysis Services uses to evaluate the current trend of the
value expression compared to the goal expression. The trend expression helps the business user to quickly
determine whether the value expression is getting better or worse relative to the goal expression.

You can associate one of several graphics with the trend expression to help business users quickly
understand the trend. For example, the following code returns a value of 1 if the Gross Profit Margin is
increasing over the last year, -1 if it is decreasing, and 0 if it is the same:

Case
When KpiValue( "Gross Profit Margin" ) >
(KpiValue( "Gross Profit Margin" ),
ParallelPeriod
( [Date].[Fiscal Time].[Fiscal Year], 1,
[Date].[Fiscal Time].CurrentMember ))
Then 1
When KpiValue( "Gross Profit Margin" ) <
(KpiValue( "Gross Profit Margin" ),
ParallelPeriod
( [Date].[Fiscal Time].[Fiscal Year], 1,
[Date].[Fiscal Time].CurrentMember ))
Then -1
Else 0
End

Additional properties of the KPI include:

Display Folder. This is where the KPI will appear if you are browsing the cube.

Parent KPI. This property defines the parent of the current KPI. The browser displays the KPI as a
child while allowing the parent to access the values of the child. This feature enables you to have KPIs
based on other KPIs. You can also apply a weight to adjust the importance of a child KPI against its
siblings.
Current Time Member. This is an MDX expression that defines the current time member for the KPI.

Browsing KPIs
A key advantage of KPIs in Analysis Services is that
they are server-based and consumable by different
client applications. Being server-based enables all
the logic to be built in Analysis Services and stored
with the cube. However, it is possible for client
applications to apply the same logic against the
cube data using MDX expressions. You can use
MDX functions in client applications to retrieve
individual sections of the KPI, such as the value or
goal, for use in MDX expressions, statements, and
scripts.
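
For example, a client application might retrieve the individual KPI components with an MDX query like the following sketch (the cube and KPI names are illustrative and would need to match your own model):

```mdx
// Return the value, goal, status, and trend of a KPI as columns
SELECT
  { KPIValue("Reseller Margin"), KPIGoal("Reseller Margin"),
    KPIStatus("Reseller Margin"), KPITrend("Reseller Margin") } ON COLUMNS
FROM [Sales]
```

Each function returns the result of the corresponding KPI expression evaluated in the current query context, so you can combine them with any dimension members on other axes.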

Note that the indicators defined in SQL Server Data Tools may or may not be implemented by client applications. Third-party client applications might not display the indicators you have defined, so you should verify that the KPI displays correctly.

The example in the above illustration uses several MDX functions to return the KPI value, goal, status, and
trend for the channel revenue measure of the Adventure Works database for descendants of three
members of the Fiscal Year attribute hierarchy.

Demonstration: Creating a KPI


In this demonstration, you will see how to create and browse a KPI in a multidimensional Analysis Services
database.

Demonstration Steps
Create a KPI

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then in
the D:\Demofiles\Mod06 folder, run Setup.cmd as Administrator.

2. Start Visual Studio and open the Demo.sln solution in the D:\Demofiles\Mod06 folder.

3. On the Build menu, click Deploy Solution. If prompted for impersonation credentials, specify the
password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.

4. After deployment is complete, in Solution Explorer, double-click SalesDemo.cube to open it in the cube designer, and then click the Calculations tab.

5. In the Script Organizer, select [Gross Margin], and note that this calculated member calculates gross
margin as a percentage of revenue.

6. In the cube designer, click the KPIs tab. Then, on the Cube menu, click New KPI.
7. In the Name box, type Reseller Margin, and then in the Associated measure group list, click
Reseller Sales.

8. In the Calculation Tools pane, on the Metadata tab, expand Measures, expand Reseller Sales, and
then drag the Gross Margin measure to the Value Expression box.

9. In the Goal Expression box, type 0.1 (this hard-coded value sets a gross margin target of 10%).

10. Verify that Gauge is selected in the Status indicator list, and then type the following MDX expression
in the Status expression box (you can copy this from KPI Status Expression.txt in the demo folder):

Case
When
KpiValue("Reseller Margin")/KpiGoal("Reseller Margin")>=.95
Then 1
When
KpiValue("Reseller Margin")/KpiGoal("Reseller Margin")<.95
And
KpiValue("Reseller Margin")/KpiGoal("Reseller Margin")>=.5
Then 0
Else -1
End

11. Verify that Standard arrow is selected in the Trend indicator list, and then type the following
expression in the Trend expression box (you can copy this from KPI Trend Expression.txt in the
demo folder):

Case
When IsEmpty(ParallelPeriod([Order Date].[Fiscal Date].[Fiscal Year],
1, [Order Date].[Fiscal Date].CurrentMember))
Then 0
When [Measures].[Gross Margin] =
(ParallelPeriod([Order Date].[Fiscal Date].[Fiscal Year],
1, [Order Date].[Fiscal Date].CurrentMember), [Measures].[Gross Margin])
Then 0
When [Measures].[Gross Margin] >
(ParallelPeriod([Order Date].[Fiscal Date].[Fiscal Year],
1, [Order Date].[Fiscal Date].CurrentMember), [Measures].[Gross Margin])
Then 1
Else -1
End

12. On the File menu, click Save All.


Browse a KPI in the Cube Designer

1. On the Build menu, click Deploy Solution. If prompted for impersonation credentials, specify the
password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.
2. After deployment is complete, in the cube designer, click the KPIs tab. Then on the Cube menu, point
to Show KPIs in, and click Browser. The KPI is shown based on the total overall margin.

3. In the top pane of the KPI browser, in the Dimension list, select Reseller Geography. Then, in the
Hierarchy list, click Country-Region, and in the Operator list, select Equal. Finally, in the Filter
Expression list, expand All, select United States, and then click OK.

4. Click anywhere in the KPI Browser pane to update the values for the Reseller Margin KPI. The
updated values indicate the overall margin for sales in the United States.

5. In the top pane of the KPI browser, in the Dimension list, select Order Date. Then, in the Hierarchy
list, click Fiscal Date, and in the Operator list, select Equal. Finally, in the Filter Expression list,
expand All, select 2006, and then click OK.

6. Click anywhere in the KPI Browser pane to update the values for the Reseller Margin KPI. The
updated values indicate the overall margin for sales in the United States in 2006.

Browse a KPI in Excel

1. In the cube designer, click the Browser tab. Then on the Cube menu, click Analyze in Excel. If a
security notice is displayed, click Enable.

2. In Excel, in the PivotTable Fields pane, under Reseller Sales, select Gross Margin. The overall gross
margin is shown in the PivotTable.

3. In the PivotTable Fields pane, under Order Date, drag Order Date.Fiscal Date to the Rows area.
The gross margin for each year is shown.

4. In the PivotTable Fields pane, expand KPIs and expand Reseller Margin. Then select Status and
Trend. The status and trend for the gross margin are indicated in the PivotTable.

5. Close Excel without saving the workbook.

6. Close Visual Studio.



Lesson 2
Working with Actions
In Microsoft SQL Server Analysis Services, an action is a stored MDX statement that can be presented
to and employed by client applications. In other words, an action is a client command defined and stored
on the server.

Lesson Objectives
After completing this lesson, you will be able to:

Describe actions.

List the types of actions.


Implement actions.

Introducing Actions
An action is a stored statement that can be
presented to and employed by client applications,
enabling business users to act upon the outcomes
of their analyses. The action is stored in the
Analysis Services database and it is important to
verify that the client application supports and
correctly carries out the events of the action.

A simple action object is composed of basic information, the target where the action is to occur, a condition to limit the action scope, and the type.

Basic information includes the name and target of the action, as well as an optional condition.

The target is the location in the cube where the action is to occur and is composed of a target type and
target object.

The target type could be level members, cells, hierarchy, hierarchy members, or others. The target object
is specific to the target type. For example, if the target type is hierarchy, the target object is any one of the
defined hierarchies in the cube.
A condition is a Boolean MDX statement that evaluates to true or false. If the statement is true, the
action is performed; if it is false, the action is not performed.
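
For example, a condition such as the following (using a hypothetical measure name) would enable an action only for cells that contain a positive sales value:

```mdx
[Measures].[Internet Sales Amount] > 0
```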

By saving and reusing actions, end users can go beyond traditional analysis, which typically ends with the presentation of data, and instead initiate solutions to discovered problems and deficiencies, extending the business intelligence application beyond the cube.

You can be flexible when creating actions. For example, an action can launch an application, retrieve
information from a database, or open a web page in a browser.

You can configure actions to be triggered from almost any part of a cube, including dimensions, levels, members, and cells, and you can create multiple actions for the same portion of a cube.

Types of Actions
Actions can be of different types and must be created accordingly. Typically, an action passes an attribute of the current object, enabling it to be dynamic. For example, you could create a URL action that displays a customer's location on a map. To build this action, you can open Bing Maps, search for any postal code, and then click Share. Bing generates a URL containing that postal code; you can then remove the hard-coded postal code and append the current member's postal code to the remaining URL in your action expression.
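
Continuing the Bing Maps example, the resulting URL action expression might look like the following sketch (the query string and the attribute names are assumptions that depend on your cube and on the URL that Bing Maps generates):

```mdx
// Concatenate the base map URL with the current member's postal code
"http://www.bing.com/maps/?where1=" +
[Customer].[Postal Code].CurrentMember.Name
```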

The following table lists the types of actions and how they are used:

Action Type: Description

Standard (Dataset): Returns a dataset to a client application.

Standard (Proprietary): Performs an operation by using an interface other than those listed in this table.

Standard (Rowset): Returns a rowset to a client application.

Standard (Statement): Runs an OLE DB command.

Standard (URL): Displays a dynamic webpage in an Internet browser.

Reporting: Submits a parameterized URL-based request to a report server and returns a report to a client application.

Drill-through: Returns a drill-through statement as an expression, which the client executes to return the set of rows representing the underlying data from the data source of the selected cells of the cube where the action occurs. For example, you could use this action on a measure group for sales orders to drill through to details that contribute to an aggregated total.

Building Actions for a Cube


Use the Actions tab of the cube designer to build actions and specify the following details for standard actions:

1. Name. Select a name that identifies the action.

2. Action Target. Select the object to which the action is attached. For Target type, select from the following objects:

o Attribute members

o Cells

o Cube

o Dimension members
o Hierarchy

o Hierarchy members

o Level

o Level members

3. Action Content. Select the type of action from the following:

o Dataset. This retrieves a dataset.


o Proprietary. This performs an operation by using an interface other than those listed here.

o Rowset. This retrieves a rowset.

o Statement. This runs an OLE DB command.


o URL. This displays a dynamic webpage in an Internet browser.

4. Additional Properties. Select additional action properties from the following:

o Invocation. This specifies how the action is run. Interactive, the default, specifies that the action
is run when a user accesses an object. The possible settings are Batch, Interactive, and On Open.

o Application. This describes the application of the action.

o Description. This describes the action.

o Caption. This provides a caption that is displayed for the action. If the caption is generated
dynamically by an MDX statement, specify True for Caption is MDX.

o Caption is MDX. This indicates that the caption is an MDX expression that, once executed, will
generate the text to be displayed.

To create a new drill-through action, on the Cube menu, click New Drillthrough Action and then specify
the drill-through columns required.
To create a reporting action, on the Cube menu, click New Reporting Action, and then specify the
Server Name, Server Path, and Report Format values.

Note: For Server Path, specify the path to the report on Reporting Services. For example,
AdventureWorks/YearlyInternetSales. For Report Format select HTML5, HTML3, Excel, or PDF
based on the client tool your users are most likely to use to access your report.

Lesson 3
Working with Perspectives
Cubes can be very complex objects for users to explore in Microsoft SQL Server Analysis Services. A
single cube can represent the contents of a complete data warehouse, with multiple measure groups in a
cube representing multiple fact tables, and multiple dimensions based on multiple dimension tables. This
cube can be very complex and powerful, but also daunting to users, who may only need to interact with
a small part of it to satisfy their business intelligence and reporting requirements.
In Microsoft SQL Server Analysis Services, you can use a perspective to reduce the perceived
complexity of a cube.

Lesson Objectives
After completing this lesson, you will be able to:

Describe perspectives.
Implement perspectives.

Introducing Perspectives
A perspective enables administrators to create
views of a cube, helping users to focus on the
most relevant data. Although cubes are much
more intuitive to navigate than relational
databases, they can become very large with
several measure groups, many measures, and
numerous dimensions. As perspectives do not
cause any additional processing overhead, you can
create them for each view of the data required by
users.

A simple perspective object is composed of basic information, dimensions, measure groups, calculations, KPIs, and actions. Each of these properties defines a subset of the cube as follows:

Basic information includes the name and default measure of the perspective.

The dimensions are a subset of the cube dimensions.

The measure groups are a subset of the cube measure groups.

The calculations are a subset of the cube calculations.

The KPIs are a subset of the cube KPIs.

The actions are a subset of the cube actions.

Objects in a cube that are not visible to the user through a perspective can still be directly referenced and
retrieved using XML for Analysis (XMLA), Multidimensional Expressions (MDX), or Data Mining Extensions
(DMX) statements.

Perspectives do not restrict access to objects in a cube and should not be used as such. Instead,
perspectives can provide a better user experience while accessing a cube.
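
Client tools query a perspective in the same way as a cube, by using the perspective name in the FROM clause. For example, the following sketch assumes a hypothetical perspective named Internet Sales in a cube that contains an Internet Revenue measure:

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS
FROM [Internet Sales]  // the perspective name is used in place of the cube name
```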

Creating a Perspective
The first column of the Perspectives tab in the
cube designer is Cube Objects, which lists all
objects in the cube.

You can add a perspective to the Perspectives tab


by:

Clicking New Perspective on the Cube menu.

Clicking the New Perspective button on the


toolbar.

Right-clicking anywhere in the pane and


clicking New Perspective on the shortcut
menu.

When you create a perspective, the name is initially Perspective (followed by an ordinal number, starting
with 1, if there is already a perspective named Perspective). You should change this to a meaningful name
so that business users can understand which perspective to use. Then select the measures, dimension
attributes and hierarchies, KPIs, named sets, and calculated members that should be included in the
perspective.

You can remove a perspective by:

Clicking any cell in the column for the perspective you want to delete. Then, on the Cube menu, click
Delete Perspective.

Clicking the Delete Perspective button on the toolbar.

Right-clicking any cell in the perspective you want to delete, and then clicking Delete Perspective on
the shortcut menu.

Lesson 4
Working with Translations
Multilanguage support in Microsoft SQL Server Analysis Services is accomplished by using translations.
In this lesson, you will be introduced to the concepts involved in using translations and discuss how to
implement cube and dimension translations.

Lesson Objectives
After completing this lesson, you will be able to:

Describe translations.

Implement cube translations.


Implement dimension translations.

Introducing Translations
A translation is a simple mechanism to change the
displayed labels and captions from one language
to another. Translating elements of a cube into a
different language enables more users to view and
understand the cube's metadata. Translations are
available for all objects in Analysis Services, and
can also be used to display an alternative column.
Therefore, if you have data in multiple languages,
a different column could be displayed for an
attribute in each language.

A simple translation object is composed of a language ID (an integer that identifies the language) and a translated caption (the translated text).
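
In the XMLA definition of an object, a translation might be represented as in the following sketch (element placement varies by object type; LCID 1036 identifies French (France)):

```xml
<Translations>
  <Translation>
    <Language>1036</Language>
    <Caption>Date de Vente</Caption>
  </Translation>
</Translations>
```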

In Microsoft SQL Server Analysis Services, a cube translation is a language-specific representation of


the name of a cube object, such as a caption or display folder. There is no automatic translation
functionality so you must provide every required label in each language.

Translations provide server support for client applications that can support multiple languages. For others,
the default language is displayed.

The collation and language information for the client computer is stored in a Locale Identifier (LCID). The
following process takes place when a client connects to the cube:

On connection, the client passes its LCID to the instance of Analysis Services.
The instance uses the LCID to determine which set of translations to use when providing metadata for
Analysis Services objects to each business user.

If an Analysis Services object does not contain the specified translation, the default language is used
to deliver content to the client.

Implementing Cube Translations


In a cube, you can specify translations for the
following objects:

Measure Groups and Measures

Dimensions

KPIs

Named Sets
Calculated Members

To create a cube translation, click the New


Translation button on the toolbar of the
Translations tab in the cube designer. To remove
a translation, select it and then click Delete Translation.

Implementing Dimension Translations


You can specify translations for the following
objects in a dimension:

Dimension

Attributes

Hierarchies and Levels

To create a dimension translation, click the New


Translation button on the toolbar of the
Translations tab in the dimension designer. To
remove a translation, select the translation and
then click Delete Translation.

In addition to specifying translated values for object names, you can also translate attribute members by
specifying another column in the dimension table that contains translated values. Select New Caption
Column to display the Attribute Data Translation dialog box and define a new caption column when
you modify an attribute in the Translation Details grid. For example, you could specify a caption column
of French Month Name for the Month attribute of the Time dimension. You can use the Edit Caption
Column and Delete Caption Column buttons to modify or delete caption columns.

Lab: Customizing a Cube


Scenario
Information workers in the sales department at Adventure Works Cycles use the cube you have created to
analyze sales data. Several users have asked to be able to view aggregated sales figures in Microsoft
Excel, and then quickly drill through to see details of specific orders for a given aggregation. Some users
are also finding the cube too complex for their needs. These users typically only want to analyze sales
amount by customer, and would like to view the measures and dimensions needed to support this
requirement. Finally, the company employs a number of senior sales managers who would like to view
cube data and metadata in French, their first language.

Objectives
After completing this lab, you will be able to:

Implement an action.

Implement a perspective.
Implement a translation.

Estimated Time: 45 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Implementing an Action


Scenario
Information workers in the sales department at Adventure Works Cycles are using the cube you have
created to analyze sales data. Several users want to view aggregated sales figures in Excel, and then
quickly drill through to see details of specific orders for a given aggregation. For example, when viewing
the sales total for a product category in a specific year, the users need to quickly generate a second Excel
worksheet that shows details of the customer, location, date, and individual product for each sale in that
year.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Create a Drill-through Action

3. Browse a Drill-through Action

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab06\Starter folder as Administrator.

Task 2: Create a Drill-through Action


1. Use Visual Studio to open and deploy the Adventure Works OLAP.sln solution in the
D:\Labfiles\Lab06\Starter folder.

o If prompted for impersonation credentials when deploying the solution, enter the password
Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.

2. Add a drill-through action named Internet Sales Details to the Sales cube.

o The action should use the Internet Sales measure group members.

o The action should return the specified drill-through columns from the following dimensions:
Customer (City and Full Name)
Order Date (Full Date Alternate Key)
Product (English Product Name)
o Set additional properties so that the drill-through action has the caption Drillthrough to Order
Details.

3. Save the solution when you have finished creating the drill-through action.

Task 3: Browse a Drill-through Action


1. Deploy the Adventure Works OLAP solution.

2. Analyze the cube in Excel and view the Internet Revenue measure by the Order Date.Calendar
Date hierarchy and the Product.Categorized Products hierarchy.

3. Right-click the sales amount for Bikes in 2007, point to Additional Actions, and click Drillthrough to
Order Details to test the drill-through action.
4. Close Excel without saving the workbook when you are finished.

Results: After this exercise, you should have defined a drill-through action.

Exercise 2: Implementing Perspectives


Scenario
Some sales users are finding the cube too complex for their requirements. They typically only need to
analyze sales amount by customer, and simply want to view the measures and dimensions required to
support this.

The main tasks for this exercise are as follows:

1. Create Perspectives

2. Browse Perspectives

Task 1: Create Perspectives


1. Add a perspective named Internet Sales to the Sales cube. The perspective should include only cube
objects that are relevant to Internet sales.

2. Add a perspective named Reseller Sales to the Sales cube. The perspective should include only cube
objects that are relevant to reseller sales.

3. Save the solution.

Task 2: Browse Perspectives


1. Deploy the Adventure Works OLAP solution.

2. Use the cube browser in Visual Studio to browse the Internet Sales perspective of the Sales cube.

o Click the ellipsis (…) in the cube selection area above the Metadata pane to select the
perspective.

3. Analyze the cube in Excel to view the Reseller Sales perspective.

4. When you have finished browsing the perspectives, close Excel without saving the workbook.

Results: After this exercise, you should have defined a perspective and browsed the cube using the new
perspective.

Exercise 3: Implementing a Translation


Scenario
The company employs a number of senior sales managers who would like to view cube data and
metadata in French, their first language.

In this exercise you will add the final enhancement to the revenue information cube by specifying
translations for the French speakers in your company.

The main tasks for this exercise are as follows:

1. Create Dimension Translations

2. Create Cube Translations


3. Browse Translations

Task 1: Create Dimension Translations


1. Using the dimension designer for the Date dimension, create a French (France) translation. This
should include translated captions for the following items:

o The Calendar Date hierarchy (Date du Calendrier).

o The Calendar Year level (Année).

o The Calendar Semester level (Semestre).

o The Calendar Quarter level (Trimestre).

o The Month level (Mois).

o The Day level (Journée).

Tip: To type é, hold the Alt key and type 130 using the number pad on your keyboard, ensuring Num
Lock is turned on. If this is not possible, type the captions without accents.

2. Configure the Month attribute to use FrenchMonthName as a translation column.


o To display hidden attributes, click the Show All Attributes toolbar icon.

o To configure attribute translation columns, click the ellipsis (…) button in the language column for the attribute.

3. Save the Date dimension when you have finished.

Task 2: Create Cube Translations


1. Using the cube designer for Sales cube, create a new French (France) translation with the following
caption translations:

o The Internet Sales measure group (Ventes d'Internet).

o The Internet Revenue measure (Revenu d'Internet).

o The Order Date dimension (Date de Vente).



2. Save the cube when you have finished.

Task 3: Browse Translations


1. Deploy the solution. If prompted for impersonation credentials, enter the password Pa$$w0rd for
ADVENTUREWORKS\ServiceAcct.

2. Use the Browser tab of the cube designer to browse the cube, reconnecting if necessary.

o In the Language list, select French (France).

o View the Revenu d'Internet measure by the Date de Vente.Date du Calendrier hierarchy.

3. Verify that the French translations are used for the captions and month attribute values.

Results: After this exercise, you should have specified translations for the time dimension metadata and
the Adventure Works cube metadata, and browsed the cube using the new translations.

Module Review and Takeaways


In this module, you have learned how to use KPIs, actions, perspectives, and translations to enhance an
Analysis Services cube.

Review Question(s)
Question: Can you think of example scenarios in which a business would use the cube
enhancements discussed in this module?

Module 7
Implementing an Analysis Services Tabular Data Model
Contents:
Module Overview 7-1

Lesson 1: Introduction to Analysis Services Tabular Data Models 7-2

Lesson 2: Creating a Tabular Data Model 7-7

Lesson 3: Using an Analysis Services Tabular Data Model in the Enterprise 7-16

Lab: Implementing an Analysis Services Tabular Data Model 7-23

Module Review and Takeaways 7-32

Module Overview
You can install Microsoft SQL Server Analysis Services in Tabular mode and create tabular data models
that information workers can access by using tools such as Excel and Power View.

This module describes Analysis Services tabular data models and explains how to develop a tabular data
model using the SQL Server Data Tools for Business Intelligence (BI) add-in for Visual Studio.

Objectives
After completing this module, you will be able to:

Describe Analysis Services tabular data model projects.

Implement an Analysis Services tabular data model.

Use an Analysis Services tabular data model.



Lesson 1
Introduction to Analysis Services Tabular Data Models
This lesson describes Analysis Services tabular data models and explains how you can use Visual Studio to
create a tabular data model.

Lesson Objectives
After completing this lesson, you will be able to describe:

Analysis Services tabular data models.

The options for creating an Analysis Services tabular data model.

The benefits of using Visual Studio to develop an Analysis Services tabular data model.

The purpose of the workspace database.

What Is a Tabular Data Model?


Tabular data models are in-memory databases
that use the xVelocity storage and processing
engine. The xVelocity engine uses column-based
storage and sophisticated compression algorithms
to deliver very fast query response times, even
with large datasets containing millions of rows of
data. This makes the xVelocity engine ideal for the
complex queries that data analysis and reporting
tools typically generate.
Tabular data models expose data in a relational
format, so the developer interacts with tables and
relationships instead of dimensions and cubes.
They are quick and easy to create compared to multidimensional data models, although they lack some of
the advanced features of multidimensional models. Tabular models are suitable for anything from a
personal desktop BI application that has been developed in Excel, to departmental or larger solutions,
depending upon the complexity of the application.

Tabular models offer two principal advantages over multidimensional models:

The relational model is widely understood and relatively intuitive, so the barrier to entry for relational database developers and information workers who want to develop analysis solutions is lower for tabular than for multidimensional models. This helps companies to minimize costs by taking advantage of their existing internal expertise to create data analysis solutions.

Tabular data models are generally simpler in design than multidimensional models, so companies can
achieve a faster time to deployment for BI applications.

Creators of tabular models can use the Data Analysis Expressions (DAX) language to create measures and
calculated columns, and to implement security. DAX is similar to the formulae used in Excel workbooks, so
information workers who already use Excel should find it relatively easy to learn and use.
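
For example, a DAX measure and a DAX calculated column might look like the following sketch (the table and column names are hypothetical):

```dax
-- A measure that sums a numeric column across the current filter context
Total Sales := SUM ( Sales[SalesAmount] )

-- A calculated column that is evaluated row by row in the Sales table
Margin = Sales[SalesAmount] - Sales[TotalProductCost]
```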

Tabular data models are generally easy to create because their relational structure enables developers to
interact directly with data, tables, and relationships without having to create complex additional structures
such as dimensions and cubes. Tabular data models help to reduce development effort, enabling
developers to get applications into production more quickly, and making it possible for companies to take
advantage of a wider pool of talent to develop Analysis Services solutions. Because relational models are
generally more widely understood than their multidimensional equivalents, in-house relational database
developers should quickly adapt to creating tabular data models with minimal training.

You can create tabular data models by using the PowerPivot add-in for Excel, or by using the SQL Server Data Tools for BI add-in for Visual Studio. After deploying tabular data model databases to Analysis Services, you can manage them by using SQL Server Management Studio.

Note: This module focuses on tabular data models in Analysis Services databases, which are
typically created by BI developers as part of a managed enterprise BI solution. Information
workers can use the PowerPivot add-in to create their own tabular data models in Excel workbooks
for personal analysis or to share with colleagues in SharePoint Server. If these PowerPivot data
models become significantly important to the business in the future, they can be imported into a
tabular Analysis Services database and managed centrally by IT.
For more information about PowerPivot, attend course 20467C: Designing Self Service BI and Big
Data Solutions.

Options for Creating an Analysis Services Tabular Data Model Project


An Analysis Services tabular data model is a
database stored on an Analysis Services instance
running in tabular mode. You can install a SQL
Server 2014 Analysis Services instance in
multidimensional, PowerPivot for SharePoint, or
tabular mode. To create tabular databases, you
must install at least one Analysis Services instance
that runs in tabular mode.

Note: You can switch the mode of an Analysis Services instance after installation, but
only if you have not yet created a database on that instance.

You can create a tabular data model project for Analysis Services using the Visual Studio project templates
provided in the SQL Server Data Tools for BI. Each of the three tabular data model templates enables you
to create a project in a different way:

Analysis Services Tabular Project. This template creates a new, empty tabular data model. You must
import data and metadata from your data sources to populate the model.

Import from PowerPivot. This template creates a new tabular data model by using the tabular data
model that is embedded in a PowerPivot for Excel workbook. The Import from PowerPivot option
extracts the data and metadata from the specified workbook and uses this to populate the new
model.

Import from Server (Tabular). This template creates a new tabular data model from an existing
Analysis Services tabular data model. It extracts the data and metadata from the specified database
and uses this to populate the new model.

Before you create a tabular data model project, you must ensure there is an instance of Analysis Services
running in tabular mode that is available for the project to use during development. The Analysis Services
instance can be local or located on a network. A local instance will typically provide better performance
during the development phase, and makes it possible to work with the model offline. However, a network
instance offers easier collaboration if several developers are involved, particularly if the project is
integrated with Team Foundation Server (TFS).

You can also create a tabular data model by using the Restore from PowerPivot option in SQL Server
Management Studio. This method enables you to create a tabular data model database directly from a
PowerPivot for Excel workbook, without having to use SQL Server Data Tools. Like the Import from
PowerPivot template in SQL Server Data Tools, the Restore from PowerPivot option extracts the data
and metadata from the specified PowerPivot for Excel workbook and uses this to populate the tabular
data model database.

You can optionally configure a tabular database in Analysis Services to use DirectQuery mode, which
passes queries directly to the underlying data source instead of using the in-memory storage engine.
DirectQuery is one of several key features only available in Analysis Services tabular data models and not
in PowerPivot for Excel workbooks. The other key features of Analysis Services tabular data models are:

Row-level security, which enables you to implement security at a much more granular level.

Partitions, which enable you to manage large tables more efficiently.

Using Visual Studio to Develop an Analysis Services Tabular Data Model


SQL Server Data Tools for BI is a comprehensive
Visual Studio-based development environment
for all your BI projects, including SQL Server
Reporting Services projects, SQL Server Analysis
Services Multidimensional projects, and SQL Server
Analysis Services Tabular projects. It supports
source control (when used with TFS or other third-party
source control plug-ins), debugging tools, build
and deployment options, and a range of other
features that, together, enable you to manage the
full project development life cycle.

When you create a tabular data model in Visual Studio, the Tabular Model Designer opens automatically. It consists of several windows, including:

Solution Explorer. This window displays the project and its contents. By default, the contents are the
References container and the Model.bim file. You can set the properties of the project, such as the
deployment server or the query mode, from this window.

Designer Window. This window displays a visual representation of the model offering two different
views:

o Data View. This view displays one table at a time, showing the columns and the data the table
contains. You can select the table you want to view by clicking the appropriate tab. The data view
contains the Measure Grid, which you can use to create measures and Key Performance
Indicators (KPIs). You can also create calculated columns in the data view by adding a DAX
expression to the formula bar.

o Diagram View. This view presents the tables in a schema-like diagram. It shows the columns and
hierarchies in each table, and the relationships between the tables. You can also use this view to
create and manage hierarchies.

Properties Window. This window displays the properties of the object you select.
Error List. This window displays errors and other messages relating to the model.

Output Window. This window displays status information relating to builds and deployments.

The Tabular Model Designer also adds extra menus to the menu bar, including Model, Table, and
Column. From the Model menu, you can launch the Table Import Wizard, which imports data and
metadata, creating all the metadata structures that underpin a tabular data model.

The Workspace Database


Creating a tabular data model project using one of
the templates in SQL Server Data Tools for BI
automatically adds a new database to the Analysis
Services Tabular instance that is configured as the
workspace server. Every tabular data model
project has its own exclusive workspace database.
Workspace databases use a naming convention
that combines the project name, the name of the
user who created it, and a system-generated
Globally Unique Identifier (GUID) in the following
format:
<project name>_<user name>_<GUID>

This naming format ensures that every workspace database has a unique name.

The workspace database is an in-memory database containing all the data and metadata for the project
while it is being developed. All the changes you make to the model in SQL Server Data Tools, such as
creating hierarchies or adding linked tables, result in updates to the workspace database. Deploying the
model creates a new database (a renamed copy of the workspace database) on the designated Analysis
Services instance.

Local and Remote Workspace Databases


Ideally, the workspace database should be located on the same computer where the developer creates the
model because this is likely to result in the best performance. You can use a remote Analysis Services
instance to host the workspace database, but this configuration has the following limitations:

You cannot create tabular data model projects using the Import from PowerPivot template.
You cannot use the Backup to disk option in the Data Backup property.

You may experience slower performance because of the latency introduced by using a remote server.

Configuring the Workspace Database


You can use the Properties window of the Model.bim file in SQL Server Data Tools to configure the
workspace database. You can set the Workspace Server value to change the Analysis Services instance
that hosts the workspace database.

Note: The default host for new workspace databases is defined in the Default workspace
server setting on the Data Modeling page in the Analysis Services settings in the Options
dialog box, which you can access from the Tools menu on the menu bar.

The Workspace Retention settings define what happens to the in-memory workspace database when
you close a tabular data model project. The default Workspace Retention setting is Unload from
memory, which removes the database from memory and stores it on disk. This setting frees up memory,
but the model takes longer to load when you re-open the project. You can also configure the Workspace
Retention setting to keep the workspace database both in memory and on disk when you close a project,
or to delete it completely. If you choose this last option, the workspace database is recreated every time
you open the project.

Lesson 2
Creating a Tabular Data Model
A tabular data model consists of multiple tables, usually linked using relationships. Columns in the tables
can be used as attributes and measures that enable business users to aggregate key business values. You
can organize the attributes by which values are aggregated into hierarchies to enable drill-down/drill-up
analysis.

Lesson Objectives
After completing this lesson, you will be able to:

Import tables into a tabular data model project.

Define measures in a tabular data model.

Manage relationships between tables in a tabular data model.

Configure columns in a tabular data model.


Create hierarchies in a tabular data model.

Importing Tables
The first step in creating a tabular data model is to
import tables of data from one or more sources.
The available data sources for a tabular model
include an instance of the SQL Server database
engine, SQL Server 2014 Analysis Services cubes,
Windows Azure SQL Database, Microsoft Excel,
Microsoft Access, text files, SQL Server Reporting
Services reports, SharePoint lists, data feeds, and
databases running on Oracle, Teradata, Sybase,
Informix, and IBM DB2 database management
systems. There are also OLE DB and ODBC
providers that enable imports from any OLE DB or
ODBC source.

When importing from a database, you can filter tables to exclude columns that are not required for your
analysis. Avoid importing unnecessary data into the data model, to ensure optimal performance and
simplify the end-user experience. You should also provide a friendly name, or alias, for the tables you
import. Tables in databases often do not have easy-to-understand names, so providing a friendly name
makes the data easier for end users to interpret.

Defining Measures
Measures are numeric values in your data model
that users can aggregate to analyze business
performance. Unlike traditional multi-dimensional
data models, in which measures can only be
defined as part of a measure group (usually
based on a fact table), tabular data models enable
you to define measures for any numeric column in
any table.

Measures are aggregated across multiple dimensions within the data model, and typically
provide a summary value for core business metrics. For example, a data model including a
sales order table will typically define measures that sum sales revenue, cost, and order quantities by
attributes of business entities defined in other tables, such as customers, products, and stores.

You can define measures in the Measure Grid area of the tabular data model designer in Visual Studio.
Measures are usually based on a simple Data Analysis Expressions (DAX) function applied to a numeric
column. For example, a measure named Revenue could be defined as the sum of values in the
SalesAmount column by using the DAX expression Revenue:=SUM([SalesAmount]).

While Sum is the most common aggregate function used to define measures, you can also use Average,
Count, Max, Min, and many others. You can also define more complex measures that reference multiple
columns, or use custom algorithms.
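For illustration, the following sketch shows measures based on other aggregate functions, and a slightly more complex expression that references two columns. The Revenue and Cost measure names follow the example above; Order Count, Average Sale, and Margin are hypothetical names introduced here, assuming the Internet Sales table and columns used elsewhere in this module:

```dax
Order Count:=COUNTROWS('Internet Sales')
Average Sale:=AVERAGE([SalesAmount])
Margin:=SUM([SalesAmount]) - SUM([TotalProductCost])
```

Each expression is entered in the formula bar for a cell in the Measure Grid, in the same way as the Revenue measure above.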

Demonstration: Creating a Tabular Data Model Project


In this demonstration, you will see how to create a tabular data model in Visual Studio.

Demonstration Steps
Create a Tabular Data Model Project

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and log
into the 20466C-MIA-SQL virtual machine as ADVENTUREWORKS\Student with the password
Pa$$w0rd. Then, in the D:\Demofiles\Mod07 folder, right-click Setup.cmd and click Run as
administrator. When prompted, click Yes.

2. Start Visual Studio, and on the File menu, point to New, and click Project.

3. In the New Project dialog box, click Analysis Services Tabular Project, in the Name text box, type
TabularDemo, in the Location box browse to D:\Demofiles\Mod07, and then click OK.

4. If the Tabular model designer dialog box is displayed, in the Workspace server list, select
localhost\SQL2, and in the Compatibility level box, select SQL Server 2014 / SQL Server 2012 SP1
(1103), and then click OK.

5. On the Tools menu, click Options, and in the Options dialog box, expand Analysis Services Tabular
Designers and click Workspace Database. Note the name of the default workspace server used to
host the data model during development. Then click Cancel.

Import Tables into a Tabular Model

1. On the Model menu, click Import From Data Source.

2. In the Table Import Wizard, on the Connect to a Data Source page, select Microsoft SQL Server,
and then click Next.

3. On the Connect to a Microsoft SQL Server Database page, in the Friendly connection name box,
type AdventureWorksDW, in the Server name box, type MIA-SQL, ensure that Use Windows
Authentication is selected, and in Database name list, select AdventureWorksDW, and then click
Next.

4. On the Impersonation Information page, in the User Name box, type


ADVENTUREWORKS\ServiceAcct, in the Password box, type Pa$$w0rd, and click Next.

5. On the Choose How to Import the Data page, ensure that Select from a list of tables and views
to choose the data to import is selected, and then click Next.

6. On the Select Tables and Views page, select the following source tables, changing the Friendly
Name as indicated in parentheses:
o DimCustomer (Customer)

o DimDate (Date)

o DimGeography (Geography)
o DimProduct (Product)

o DimProductCategory (Product Category)

o DimProductSubcategory (Product Subcategory)

o FactInternetSales (Internet Sales)

7. Select the row for the DimCustomer table, and click Preview & Filter. Then clear the following
columns and click OK:
o Title

o MiddleName

o NameStyle

o Suffix

o SpanishEducation

o FrenchEducation

o SpanishOccupation

o FrenchOccupation

8. Select the row for the DimDate table, and click Preview & Filter. Then clear the following columns
and click OK:

o SpanishDayNameOfWeek

o FrenchDayNameOfWeek
o SpanishMonthName

o FrenchMonthName

o CalendarSemester

o FiscalSemester

9. When you have selected and filtered the tables, in the Table Import Wizard dialog box, click Finish
and wait for the data to be imported. When the data has been imported successfully, click Close.

10. Click each of the tabs in the model designer to see the data that has been imported into each table.

Define Measures in a Data Model

1. Click the Internet Sales tab, and then click the first empty cell in the measure grid under the
SalesAmount column.

2. On the Column menu, point to AutoSum, and click Sum.

3. In the formula bar, modify the expression that has been generated to change the name of the
measure to Revenue, as shown in the following example:

Revenue:=SUM([SalesAmount])

4. Widen the SalesAmount column and note that the measure is calculated and displayed as a currency
value. The formatting has been inherited from the column.

5. Select the cell containing the Revenue measure, and press F4 to display the properties pane. Then
view the Format property, which is set to Currency.

6. Create a second measure named Cost in the measure grid under the TotalProductCost column
based on the following expression, and format the Cost measure as currency:

Cost:=Sum([TotalProductCost])

7. On the File menu, click Save All.

Test the Data Model in Excel


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.
3. When Excel opens, note that the measures you defined are displayed under an Internet Sales
measure group in the PivotTable Fields pane. The Internet Sales table is also shown in this pane,
and contains the columns in the table, including those on which the aggregated measures are based.
Some users might find this confusing so, in a future refinement of the data model, the columns that
are not useful for analysis should be hidden.

4. Select the Revenue measure so that it is summarized in the PivotTable.


5. In the Date table, select CalendarYear so that the revenue in the PivotTable is aggregated by year.
However, there is no indication in the Date table that tells the user if the values displayed represent
the order date, the due date, or the ship date associated with the order.

6. In the Date table, select EnglishMonthName so that the revenue is further broken down into
monthly totals. Note, however, that the months are shown in alphabetical order instead of
chronological order.
7. Close Excel without saving the workbook.

8. Keep Visual Studio open for the next demonstration.



Managing Relationships
When you import data from a table in a relational
database, you can choose to automatically import
additional tables having a relationship with that
table. The Table Import Wizard identifies related
tables by analyzing their foreign key relationships.
However, data analysts frequently need to analyze
data that comes from multiple sources and for
which explicit relationships are not defined. You
can manually create, edit, and delete relationships
as required in tabular data model projects.

When you create a relationship, you must specify the two tables you want to relate and the common
column containing the key values. Relationships between tables in tabular data models are one-to-many,
so you must specify which column and table is on each end of the relationship. Tabular data models
support multiple relationships between the same pair of tables.

For example, date and fact tables in data warehouses often have multiple relationships, such as order date,
due date, and delivery date. However, only one relationship between two tables can be active at a time.
When you view multiple relationships in the diagram view of the data model designer, the active
relationship is shown as an unbroken line, and inactive relationships appear as dotted lines.

Note: If you need to implement a role-playing dimension in which multiple relationships
between a fact table and a dimension table must be active concurrently, you must import the
dimension table multiple times, assigning a different name each time, and create individual
relationships between the fact table and each copy of the dimension table.
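Alternatively, an individual measure can use an inactive relationship by wrapping its calculation in the DAX CALCULATE function with USERELATIONSHIP. A minimal sketch, assuming the Internet Sales and Date tables used in this module, with an inactive relationship on ShipDateKey; the measure name Ship Revenue is introduced here for illustration:

```dax
Ship Revenue:=CALCULATE(
    SUM('Internet Sales'[SalesAmount]),
    USERELATIONSHIP('Internet Sales'[ShipDateKey], 'Date'[DateKey])
)
```

This approach avoids importing a second copy of the dimension table, but applies only within the measure that specifies USERELATIONSHIP; browsing by the dimension's attributes still follows the active relationship.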

Configuring Columns
The columns in the tables you import into a
tabular data model become the attributes by
which analysts can aggregate measures for
analysis. For example, a Customer table might
include City and Gender columns, enabling
analysts to aggregate numerical measures such as
revenue by city, or order quantity by customer
gender.

Specifying Column Data Types


When you have imported the tables for your data
model, you can specify the data type of each
column. For example, if the Customer table
includes a DateOfBirth column, you can specify that it contains date values.

Creating Calculated Columns


As well as columns in the source table, you can define additional attributes by creating calculated
columns. A calculated column is based on a DAX formula, which uses a similar syntax to a formula in
Microsoft Excel. For example, in the Customer table described previously, you could concatenate the
FirstName and LastName columns to create a calculated column named FullName, based on the DAX
expression =[FirstName] & " " & [LastName], or create a calculated column named YearOfBirth based
on the DAX expression =YEAR([DateOfBirth]).
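As a sketch, these calculated-column expressions would be entered in the formula bar, with the column named by renaming its header. The third example is hypothetical: it assumes the Product Subcategory and Product Category tables imported in this module's demonstrations, and uses the DAX RELATED function to look up a value across an existing relationship:

```dax
// FullName column in the Customer table:
=[FirstName] & " " & [LastName]

// YearOfBirth column in the Customer table:
=YEAR([DateOfBirth])

// Hypothetical Category column in the Product Subcategory table,
// retrieving the name from the related Product Category table:
=RELATED('Product Category'[EnglishProductCategoryName])
```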

Specifying Sort Order for Columns


You can define sort orders for columns, including sorting a column based on the value in another column
in the same table. This approach is most commonly used in time dimension tables containing numeric
values such as a MonthOfYear column with values between 1 and 12, and text values, such as a
MonthName column that has values from January to December. In this example, you can specify that
the MonthName column should be sorted based on the corresponding MonthOfYear value.

Hiding Columns from Client Tools


By default, all columns in all tables are visible when the data model is browsed in a client tool such as
Microsoft Excel. This includes columns that are not useful for aggregation, such as those on which you
have based measures, and columns encapsulated in hierarchies. You can make the data model simpler for
users to browse by hiding columns that are not useful as attributes for aggregation.

Demonstration: Managing Relationships and Columns


In this demonstration, you will see how to manage relationships and configure columns in a tabular data
model.

Demonstration Steps
Manage Relationships

1. Ensure that you have completed the previous demonstration then, in Visual Studio, with the
Model.bim pane visible, on the Model menu, point to Model View and click Diagram View.
The tables are shown as a schema diagram, with lines between them to denote relationships. Note that
there are three relationships between the Internet Sales table and the Date table.

2. Double-click the solid line between the Internet Sales and Date tables and note the columns that are
used to define the relationship for this active relationship. Then click Cancel.

3. Double-click each of the dotted lines between the Internet Sales and Date tables and note the
columns that are used to define the relationship for these inactive relationships. Then click Cancel.

4. Right-click each dotted relationship line in turn, and click Delete. When prompted to delete the
relationship from the model, click Delete from Model.

5. Double-click the remaining solid line between the Internet Sales and Date tables, and in the
Column list, select OrderDateKey if it is not already selected. This ensures that the active relationship
is based on the order date, not the shipping or due date. Then click OK.

6. Right-click the Date table title bar and click Rename. Then rename the table to Order Date.
7. On the Model menu, click Existing Connections.

8. In the Existing Connections dialog box, ensure that the AdventureWorksDW connection is selected
and click Open. If you are prompted for impersonation credentials, enter the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd, and click OK.

9. On the Choose How to Import the Data page, ensure that Select from a list of tables and views
to choose the data to import is selected, and then click Next.

10. On the Select Tables and Views page, select the DimDate table and change the Friendly Name to
Ship Date.

11. Select the row for the DimDate table, and click Preview & Filter. Then clear the following columns
and click OK:

o SpanishDayNameOfWeek

o FrenchDayNameOfWeek
o SpanishMonthName

o FrenchMonthName

o CalendarSemester

o FiscalSemester

12. Click Finish and wait for the table to be imported. When it has been imported successfully, click
Close.

13. Arrange the diagram so that you can see the Ship Date and Internet Sales tables, and then drag the
ShipDateKey column from the Internet Sales table to the DateKey column in the Ship Date table
to create the relationship.
Rename and Hide Columns

1. Click the title bar of the Order Date table. Then click its Maximize icon.

2. In the maximized Order Date table, click the DateKey column, hold the Ctrl key, and click the
DayNumberOfWeek and MonthNumberOfYear columns to select them. Then right-click any of the
selected columns and click Hide from Client Tools.

3. In the maximized Order Date table, right-click the FullDateAlternateKey column and click Rename.
Then rename the column to Date. Repeat this process to rename EnglishDayNameOfWeek to
Weekday, and EnglishMonthName to Month.

4. Click the Restore icon for the Order Date table.

5. Maximize the Internet Sales table, then hide all columns other than the Revenue and Cost measures
you created in the previous demonstration, and restore the Internet Sales table.

Configure Column Sort Order

1. On the Model menu, point to Model View and click Data View.

2. On the Order Date tab, click the Weekday column heading. Then in the Column menu, point to
Sort and click Sort by Column.
3. In the Sort By Column dialog box, in the Sort column, ensure that Weekday is selected and, in the
By column, select DayNumberOfWeek. Then click OK.

4. Click the Month column heading. Then in the Column menu, point to Sort and click Sort by
Column.

5. In the Sort By Column dialog box, in the Sort column, ensure that Month is selected and, in the By
column, select MonthNumberOfYear. Then click OK.

6. On the File menu, click Save All.



Review Your Changes in Excel

1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, note that the measures you defined are still displayed under an Internet Sales
measure group in the PivotTable Fields pane, but that the Internet Sales table is no longer shown
here because all other columns have been hidden.

4. Select the Revenue measure so that it is summarized in the PivotTable.


5. In the Order Date table, select CalendarYear so that the revenue in the PivotTable is aggregated by
year based on the order date.

6. In the Order Date table, select Month so that the revenue is further broken down into monthly
totals. Note that the months are now shown in chronological order.

7. In the Order Date table, select CalendarQuarter and note that the quarter is shown under each
month.
8. In the PivotTable Fields pane, in the Rows area, drag CalendarQuarter so that it is above Month,
and the PivotTable shows the correct hierarchy for year, quarter, and month. This user experience
could be improved by providing a hierarchy that enables users to drill down through the levels in the
correct order.

9. Close Excel without saving the workbook.

10. Keep Visual Studio open for the next demonstration.

Creating Hierarchies
A hierarchy is a collection of attributes from a
single table organized as parent and child nodes.
You create hierarchies to simplify the end-user
experience and enable drill-up/drill-down
aggregations. For example, you might have
Product Category, Subcategory, and Product
Name columns in a table. These values represent a
natural hierarchy in which you can organize data,
enabling users to view aggregations such as total
sales revenue at the category, subcategory, and
individual product levels.

Demonstration: Creating a Hierarchy


In this demonstration, you will see how to create a hierarchy in a tabular data model.

Demonstration Steps
Create a Hierarchy

1. Ensure that you have completed the previous demonstration. Then, in Visual Studio, with the
Model.bim pane visible, on the Model menu, point to Model View and click Diagram View.

2. Maximize the Order Date table, and then on its title bar, click the Create Hierarchy icon. Name the
new hierarchy Calendar Date.

3. Drag the CalendarYear column to the Calendar Date hierarchy.

4. Drag the CalendarQuarter column to the Calendar Date hierarchy.

5. Right-click the Month column, point to Add to Hierarchy, and click Calendar Date.

6. Right-click the Date column, point to Add to Hierarchy, and click Calendar Date.

7. Restore the Order Date table, and on the File menu, click Save All.

Browse the Hierarchy in Excel

1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, select the Revenue measure so that it is summarized in the PivotTable.

4. In the Order Date table, select Calendar Date so that the revenue in the PivotTable is aggregated by
year.

5. In the PivotTable, expand the years to show the revenue by quarter, expand the quarters to view
revenue by month, and expand the months to view revenue for individual dates.
6. Close Excel without saving the workbook.

7. Keep Visual Studio open for the next demonstration.



Lesson 3
Using an Analysis Services Tabular Data Model in the
Enterprise
SQL Server 2014 Analysis Services includes tabular data model capabilities that can be useful in an
enterprise BI scenario. It enables you to define perspectives as subsets of the data model for specific user
groups, and to partition the tables in the data model to improve data refresh processing performance.
There is also a choice of query modes to help you strike the right balance between query and processing
performance, and the ability to define security restrictions on a data model.

Lesson Objectives
After completing this lesson, you will be able to:

Create perspectives in a tabular data model.


Create partitions in a tabular data model.

Implement DirectQuery mode in a tabular data model.

Implement security in a tabular data model.


Deploy a tabular data model.

Perspectives
Tabular data models can contain large numbers of
tables and measures, which can make it difficult
for users to work with the data in a client tool such
as Excel. You can simplify the user experience by
creating perspectives that display only a subset of
the tables, columns, and measures in a model. For
example, the tables you import into a tabular
data model from a database might
contain many different types of data, such as
human resources, sales, product, promotions, and
financial data. However, users who want to analyze
sales performance will probably not need to view
the financial or human resources tables, so you could create a perspective where these are explicitly
excluded. When users connect to the model with this perspective, they will not see the excluded tables in
their client application.

Creating Perspectives
You can create a perspective in an Analysis Services tabular data model by using the Perspectives dialog
box in Visual Studio. You can open this dialog box by clicking the Model menu, and then clicking
Perspectives.

When you create a perspective, you must provide a name for it and specify the tables, measures, and
columns to include. When users connect to the model, they will be able to select the perspective they
want to use, so you should ensure that the names you give to perspectives are descriptive and easy to
understand. To speed up the process, you can also create a perspective by copying an existing similar one.

Perspectives and Security


You cannot use perspectives to secure tabular data models. As you are unable to define permissions on a
perspective, users with the right to view the tabular data model will be able to use any perspective
defined in the model. Additionally, if a user does not have permissions to view certain tables, they cannot
circumvent this by using a perspective. If you need to secure your tabular data models, you should create
roles and define permissions for the required objects.

Partitions
Partitions provide a way for you to manage large
tables more efficiently. You can divide a table up
into logical units that you can load or reload
individually. For example, imagine you have a
large table called Sales, containing sales data, in
your tabular data model. The Sales source table is
a data warehouse fact table that is updated daily
and, as a result, the data quickly goes out of date.
You can refresh the data in Sales, but this
operation is time consuming as it reloads all the
rows from the source table. To help keep Sales
synchronized with the source table, you could
create partitions that divide up the table, perhaps by using the year or month value in the column
containing the order date as the key. You now only need to regularly refresh the partition that contains
the most recent data, updating the other partitions only when required.
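
For example, if the Sales table were partitioned by year on its order date, each partition could be defined
by a Transact-SQL query like the following (the table and column names here are illustrative, not taken
from a specific sample database):

```sql
-- Partition holding only the orders placed in 2014 (illustrative schema)
SELECT *
FROM FactSales
WHERE OrderDate >= '20140101' AND OrderDate < '20150101';
```

Each partition uses the same query with a different date range, so together the partitions cover the whole
table without overlapping rows, and only the partition for the current period needs frequent processing.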

Note: Partitions in tabular data models are not intended to improve query performance in
the way that partitions in other types of databases can. Partitioning can, however, improve data
model processing performance by enabling you to process only partitions that contain new or
updated data.

Creating Partitions
Every table you import into a tabular data model has a single partition associated with it by default. This
enables you to refresh each table separately. During the development stage, you can create a new
partition in a tabular data model by using the following steps:

1. In the Tabular Model Designer in Visual Studio, click the Table menu, and then click Partitions.

2. In Partition Manager, select the table for which you want to create a partition, and then click New.

Note: You can also create a partition by copying an existing one.

3. Use the Table Preview pane to select the rows to include in the partition, or use the Query Editor
pane to define a Transact-SQL query that selects them. The partitions you create in this way are
added to the workspace database for the project.
database for the project.

For tabular data models deployed to an Analysis Services instance, you can create and manage partitions
by using SQL Server Management Studio.

Using Partitions in the Data Modeling Phase


When creating a tabular data model using a source database with large tables, you should consider
implementing partitions to enable you to work with a reduced sample data set. Using a sample data set
during development is more efficient because it reduces data load times and enables better query
performance. You can create, delete, edit, and merge partitions in the workspace database as required.
When you deploy the model, you can import all the data and employ a different partitioning design
appropriate to the deployed database.

Processing Partitions
Processing populates the tables and partitions in a tabular data model by importing data from the source
database. You can choose to process individual tables or partitions, or all of the tables and partitions in a
model together. Depending on the option you choose, processing also rebuilds relationships and
hierarchies as well as recalculating calculated columns and measures. The following processing options are
available:

Process Default. This option detects the current state of a partition and performs the necessary
actions to ensure it is in a fully-processed state. When you first create a partition, this option loads it
and creates all hierarchies, relationships, measures, and calculated columns.

Process Full. This option fully processes a partition. It removes existing data and repopulates
partitions, before recreating hierarchies, relationships, measures, and calculated columns.

Process Data. This option only processes data, and does not recreate hierarchies, relationships,
measures, and calculated columns.

Process Clear. This option deletes data from a partition.

Process Add. This option updates partitions by only adding new data.

You can process partitions in SQL Server Data Tools by clicking the Process drop-down button on the
toolbar, and then clicking Process Partitions.

You can process partitions in SQL Server Management Studio by using the Process Partitions dialog box,
which you can access by right-clicking the table that you need to update, clicking Partitions, and then
clicking Process.

DirectQuery Mode
The xVelocity engine provides in-memory storage
and data processing for very fast query handling,
and is appropriate for most Analysis Services
tabular data models. However, you can also
configure a tabular data model to bypass the
xVelocity engine and route queries directly to the
underlying data source by using DirectQuery
mode. This option might be suitable in the
following circumstances:

For tabular data models with very large data sets that take a long time to load into memory, and
which might fail to load if there is insufficient memory available. Using DirectQuery removes the need
to load data into memory.

For tabular data models that need to contain up-to-date information and for which frequent
processing is impractical. DirectQuery retrieves source data directly, providing access to the most
recent data.

For organizations that have invested in high-specification data warehouse hardware that can deliver
excellent performance. DirectQuery enables companies to take advantage of their hardware
performance while still providing access to data through a tabular data model. In this situation, you
should test thoroughly to determine whether you can achieve the best performance by relying on the
data warehouse hardware or by using the xVelocity engine.

Designing Tabular Data Models for use with DirectQuery


When you design tabular data models for which you intend to enable DirectQuery mode, there are
several points to consider:

DirectQuery only supports a single data source, which must be a SQL Server relational database.

You cannot use pasted data or linked tables in the model.

You cannot use calculated columns in the model.


You cannot use certain DAX functions, like those for time intelligence, in the model. Others, such as
certain statistical functions, might behave differently in DirectQuery mode.

You cannot use row-level security in the model.


Only client applications that support DAX queries can connect to the model. Client applications that
use MDX to query the model will not be able to connect to it.

Enabling DirectQuery Mode


You can enable DirectQuery for a database in SQL Server Management Studio by configuring the
DirectQueryMode setting in the database properties. You can enable DirectQuery in SQL Server Data
Tools by configuring the DirectQueryMode setting in the properties of the Model.bim file.

Tabular Data Model Security


You can configure security in a tabular data model
by creating database roles, and then applying
permissions and filters to them. You should use
roles to gather together Microsoft Windows user
accounts and groups with the same data access
requirements, and then apply the permissions to
the role. You can create and manage roles in a
tabular data model in the development phase by
using the Role Manager dialog box in SQL Server
Data Tools. You can access this dialog box by
clicking Model on the menu bar, and then clicking
Roles. You can add members to roles in a tabular
data model you have deployed by using SQL Server Management Studio, but you cannot create roles
using this tool.

Database Permissions
You can use Role Manager to define database permissions for roles. You can define permissions that allow
five levels of access to a tabular data model database. They are:

Read. This permission gives role members read access to the database.

Read and Refresh. This permission gives role members read access to the database and allows them
to refresh data.

Refresh. This permission gives role members the ability to refresh data in the database.

Administrator. This permission gives role members administrative rights for the database.

None. This permission prevents role members from accessing the database.

Row-Level Security
You can create DAX expressions that filter the rows that role members can see. DAX filters are expressions
that evaluate to TRUE or FALSE. Rows that evaluate to TRUE are visible to role members. For example, the
expression in the following code example filters the Country column in the Region table:

Region[Country]="France"

The rows that contain the value France evaluate to TRUE, so they are visible to role members. Rows
containing other values in the Country column are not visible.
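
A filter expression can also combine conditions by using DAX logical operators such as || (OR). For
example, the following expression (using the same illustrative table and column names as above) makes
rows for both France and Germany visible to role members:

```dax
Region[Country]="France" || Region[Country]="Germany"
```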

You can only define DAX filters for roles that have the Read or the Read and Refresh database permissions.
Roles with None or Refresh permissions cannot view any data, therefore filters are irrelevant. Roles that
have the Administrator permission can view all data.

You can test the database permissions and row-level filters you define by using the Analyze in Excel
option in SQL Server Data Tools, and connecting as a specific role.

Deploying a Tabular Data Model


When you have completed the development
phase for a tabular data model, you can deploy it
to an Analysis Services instance running in tabular
mode. This enables information workers and data
analysts to connect to the data model by using
client tools such as Excel and Power View.
Deploying a tabular data model copies the data
and metadata in the workspace database and uses
it to create a new database which has the same
name as the project by default. You can deploy to
the same server you used to develop the model
but, more often, you will deploy to a different
Analysis Services instance, such as a test or production server.
You can control this process by using the Deployment Options and Deployment Server settings in the
project properties in SQL Server Data Tools.

Deployment Options
Processing Option. Use this setting to define how the new database is processed. There are three
options:

Default. When you select this option, Analysis Services detects the state of the database and
processes the data and metadata accordingly.

Full. When you select this option, Analysis Services processes all data and metadata.

Do Not Process. When you select this option, Analysis Services deploys only the metadata.

Transactional Deployment. By default, Analysis Services will deploy objects even if processing for
those objects fails. You can use this option to force deployment to fail if processing fails.

Query Mode. Use this option to select the DirectQuery or InMemory mode for data storage.

Impersonation Settings. Use this option to specify the credentials to use when connecting to the
data sources for the tabular data model.

Deployment Server
Server. Use this option to specify the name of the deployment server.

Edition. Use this option to specify the edition of the Analysis Services instance on the deployment
server.

Database. Use this option to specify the name of the database on the deployment server.

Cube Name. Use this option to specify the name that client tools see when they connect to the
model.

Client Connections
Clients can connect directly to a tabular data model in Analysis Services by using tools such as Excel, but
you can make it easier for them by creating a BI Semantic Model Connection in the PowerPivot Gallery. A
BI Semantic Model Connection enables users to connect to a tabular database in Analysis Services or a
PowerPivot workbook by using Excel or Power View with just a click. Users do not need to create
connections manually.

Note: Before you can create a BI Semantic Model Connection in a SharePoint document
library, it must be configured to support the BI Semantic Model Connection content type. You
can enable content types in the Library Settings page for any document library where you want
users to be able to publish connections to tabular data models.

Demonstration: Deploying a Tabular Data Model


In this demonstration, you will see how to deploy a tabular data model to Analysis Services.

Demonstration Steps
Deploy a Tabular Data Model Project

1. Ensure you have completed the previous demonstration in this module.

2. In Visual Studio, in Solution Explorer, right-click the TabularDemo project and click Properties.

3. In the TabularDemo Property Pages dialog box, in the Deployment Server section, verify that the
Server value is localhost\SQL2.

4. Change the Database property to DemoDB and change the Cube Name property to Internet Sales.
Then click OK.

5. On the Build menu, click Deploy TabularDemo.

6. In the Deploy dialog box, when deployment has completed, click Close.

7. Close Visual Studio.


Use a Tabular Analysis Services Database in Excel

1. Start Excel and create a new blank workbook.

2. On the Data tab, in the Get External Data area, in the From Other Sources drop-down list, select From
Analysis Services.

3. In the Data Connection Wizard dialog box, in the Server name box, type MIA-SQL\SQL2. Ensure
that Use Windows Authentication is selected, and click Next.

4. On the Select Database and Table page, ensure that the DemoDB database and the Internet Sales
cube are selected. Then click Next.

5. On the Save Data Connection File and Finish page, click Finish.

6. In the Import Data dialog box, ensure that the Existing Worksheet option is selected and click OK.

7. In the PivotTable Fields pane, under Internet Sales, select Revenue.

8. In the PivotTable Fields pane, under Order Date, select the Calendar Date hierarchy.

9. In the PivotTable Fields pane, under Geography, drag EnglishCountryRegionName to the
Columns area.

10. Explore the data in the PivotTable. When you have finished, close Excel without saving the workbook.

Lab: Implementing an Analysis Services Tabular Data


Model
Scenario
As a BI developer at Adventure Works Cycles, you have been tasked with creating an Analysis Services
data model for reseller sales analysis.

Objectives
After completing this lab, you will be able to:

Create an Analysis Services tabular model.

Configure columns and relationships.


Deploy an Analysis Services tabular data model.

Estimated Time: 60 Minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Creating an Analysis Services Tabular Data Model Project


Scenario
Adventure Works Cycles has a data warehouse containing fact and dimension tables for sales. You plan to
use the tables relating to reseller sales as the basis of your data model.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment


2. Create a Tabular Analysis Services Project

3. Import Tables Into the Data Model

4. Create Measures

5. Test the Model

Task 1: Prepare the Lab Environment


1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab07\Starter folder as Administrator.

Task 2: Create a Tabular Analysis Services Project


1. Use Visual Studio to create an Analysis Services tabular project named AWSalesTab in the
D:\Labfiles\Lab07\Starter folder.

2. Use the localhost\SQL2 instance of Analysis Services as the workspace server, and set the
compatibility level of the project to SQL Server 2014 / SQL Server 2012 SP1 (1103).

Task 3: Import Tables Into the Data Model


1. Open the Model.bim model and import the following tables from the AdventureWorksDW database
on the MIA-SQL instance of SQL Server (use the friendly names in parentheses):

o DimDate (Date)

o DimEmployee (Employee)

o DimGeography (Geography)
o DimProduct (Product)

o DimProductCategory (Product Category)

o DimProductSubcategory (Product Subcategory)

o DimReseller (Reseller)

o FactResellerSales (Reseller Sales)

2. Filter the following tables to remove the columns listed below:


o DimDate
SpanishDayNameOfWeek
FrenchDayNameOfWeek
DayNumberOfYear
WeekNumberOfYear
SpanishMonthName
FrenchMonthName
CalendarSemester
FiscalSemester
o DimEmployee
SalesTerritoryKey
NameStyle
Title
HireDate
BirthDate
LoginID
EmailAddress
Phone
MaritalStatus
EmergencyContactName
EmergencyContactPhone
SalariedFlag
Gender
PayFrequency
Baserate

VacationHours
SickLeaveHours
CurrentFlag
SalesPersonFlag
StartDate
EndDate
Status
o DimGeography
SpanishCountryRegionName
FrenchCountryRegionName
IpAddressLocator
o DimProduct
WeightUnitMeasureCode
SizeUnitMeasureCode
SpanishProductName
FrenchProductName
FinishedGoodsFlag
SafetyStockLevel
ReorderPoint
DaysToManufacture
ProductLine
DealerPrice
Class
Style
ModelName
FrenchDescription
ChineseDescription
ArabicDescription
HebrewDescription
ThaiDescription
GermanDescription
JapaneseDescription
TurkishDescription
StartDate
EndDate
Status

o DimProductCategory
SpanishProductCategoryName
FrenchProductCategoryName
o DimProductSubcategory
SpanishProductSubcategoryName
FrenchProductSubcategoryName
o DimReseller
OrderFrequency
OrderMonth
FirstOrderYear
LastOrderYear
ProductLine
AddressLine1
AddressLine2
AnnualSales
BankName
MinPaymentType
MinPaymentAmount
AnnualRevenue
YearOpened
o FactResellerSales
DueDateKey
PromotionKey
CurrencyKey
SalesTerritoryKey
RevisionNumber
CarrierTrackingNumber
CustomerPONumber
DueDate

Task 4: Create Measures


1. In the model that has been created, edit the Reseller Sales table to add the following measures:

o Quantity:=Sum([OrderQuantity])

o Cost:=Sum([TotalProductCost])

o Revenue:=Sum([SalesAmount])

2. Format the Quantity measure as a whole number that includes a thousand separator, and ensure that
the Cost and Revenue measures are formatted as currency values.

Task 5: Test the Model


1. Analyze the model you have created in Excel.

2. Note the following problems with the model:

o Aggregated measures from the Reseller Sales table are easily confused with the columns on
which they are based.

o Column names are not user-friendly.


o Months in the Date table are sorted alphabetically, not chronologically.

o The relationship between Reseller Sales and Date is ambiguous, because sales orders include an
order date and a ship date.

3. You will fix these problems in the next exercise.

Results: After this exercise, you should have created a tabular data model project.

Exercise 2: Configuring Columns and Relationships


Scenario
You have created an initial tabular data model, but have identified some problems with column names
that are not user-friendly, sorting of some column values, and ambiguous relationships.

The main tasks for this exercise are as follows:

1. Configure Relationships
2. Rename and Hide Columns

3. Configure Column Sort Order

4. Create Hierarchies

5. Test the Model

Task 1: Configure Relationships


1. View the existing relationships between the Reseller Sales and Date tables.

2. Make the active relationship the one based on the order date, delete all other relationships between
these tables, and rename the Date table to Order Date.

3. Open the existing connection to the AdventureWorksDW database and import the DimDate table
again.

o Use the ADVENTUREWORKS\ServiceAcct account with the password Pa$$w0rd if you are
prompted for impersonation credentials.

o Use the friendly name Ship Date for the DimDate table.

o Filter the table to delete the following columns:


SpanishDayNameOfWeek
FrenchDayNameOfWeek
DayNumberOfYear
WeekNumberOfYear
SpanishMonthName

FrenchMonthName
CalendarSemester
FiscalSemester
4. Create a relationship between the Reseller Sales and Ship Date tables based on the ShipDateKey
and DateKey columns.

Task 2: Rename and Hide Columns


1. In the Geography table, hide the GeographyKey and SalesTerritoryKey columns, and rename the
following columns:

Column New Name

StateProvinceCode State or Province Code

StateProvinceName State or Province

CountryRegionCode Country or Region Code

EnglishCountryRegionName Country or Region

PostalCode Postal Code

2. In the Reseller table, hide the ResellerKey, GeographyKey, and ResellerAlternateKey columns and
rename the following columns:

Column New Name

BusinessType Business Type

ResellerName Reseller Name

NumberEmployees Employees

3. In the Employee table, hide the EmployeeKey, ParentEmployeeKey,
EmployeeNationalIDAlternateKey, and ParentEmployeeNationalIDAlternateKey columns and
rename the following columns:

Column New Name

FirstName First Name

LastName Last Name

MiddleName Middle Name

DepartmentName Department

EmployeePhoto Photo

4. In the Order Date table, hide the DateKey, DayNumberOfWeek, and MonthNumberOfYear
columns and rename the following columns:

Column New Name

FullDateAlternateKey Date

EnglishDayNameOfWeek Weekday

DayNumberOfMonth Day of Month

EnglishMonthName Month

CalendarQuarter Calendar Quarter

CalendarYear Calendar Year

FiscalQuarter Fiscal Quarter

FiscalYear Fiscal Year

5. In the Ship Date table, hide the DateKey, DayNumberOfWeek, and MonthNumberOfYear
columns and rename the following columns:

Column New Name

FullDateAlternateKey Date

EnglishDayNameOfWeek Weekday

DayNumberOfMonth Day of Month

EnglishMonthName Month

CalendarQuarter Calendar Quarter

CalendarYear Calendar Year

FiscalQuarter Fiscal Quarter

FiscalYear Fiscal Year



6. In the Product table, hide the ProductKey, ProductAlternateKey, and ProductSubcategoryKey
columns and rename the following columns:

Column New Name

EnglishProductName Product Name

StandardCost Standard Cost

ListPrice List Price

SizeRange Size Range

LargePhoto Photo

EnglishDescription Description

7. In the Product Subcategory table, hide the ProductSubcategoryKey,
ProductSubcategoryAlternateKey, and ProductCategoryKey columns, and rename the
EnglishProductSubcategoryName column to Subcategory.

8. In the Product Category table, hide the ProductCategoryKey and ProductCategoryAlternateKey
columns and rename the EnglishProductCategoryName column to Category.

9. In the Reseller Sales table, hide all columns other than the Quantity, Cost, and Revenue measures
you created in the previous exercise.

Task 3: Configure Column Sort Order


1. In the Ship Date table, configure the sort orders of the following columns:

o The Weekday column should be sorted by the DayNumberOfWeek column.

o The Month column should be sorted by the MonthNumberOfYear column.


2. In the Order Date table, configure the sort orders of the following columns:

o The Weekday column should be sorted by the DayNumberOfWeek column.

o The Month column should be sorted by the MonthNumberOfYear column.

Task 4: Create Hierarchies


1. In the Geography table, create a hierarchy named Location that includes the following columns:

o Country or Region

o State or Province

o City

o Postal Code

2. In the Order Date table, create a hierarchy named Calendar Date that contains the following fields:

o Calendar Year

o Calendar Quarter

o Month

o Day of Month

3. In the Order Date table, create a second hierarchy named Fiscal Date that contains the following
fields:

o Fiscal Year

o Fiscal Quarter

o Month
o Day of Month

Task 5: Test the Model


1. View the data model in Excel, and verify that the changes you have made are reflected in the
PivotTable.

2. When you have finished, close Excel without saving the workbook.

Results: After completing this exercise, you should have a tabular data model that includes renamed
columns, custom sort orders, and specific relationships between tables.

Exercise 3: Deploying an Analysis Services Tabular Data Model


Scenario
The tabular data model is now ready for use, so you must deploy it to a server.

The main tasks for this exercise are as follows:

1. Deploy the Reseller Sales Project

2. Use the Deployed Tabular Database

Task 1: Deploy the Reseller Sales Project


1. Configure the project for deployment:

o The project should be deployed to the localhost\SQL2 instance of Analysis Services.


o The deployed database should be named AdventureWorksTab.

o The deployed cube should be named Reseller Sales.

2. Build and deploy the project. Then close Visual Studio.

Task 2: Use the Deployed Tabular Database


1. Create a blank workbook in Excel, and then import the data model from Analysis Services.

o Use the From Analysis Services source on the Data tab of the ribbon.

o Connect to the MIA-SQL\SQL2 instance of Analysis Services using Windows authentication.


o Select the Reseller Sales cube in the AdventureWorksTab database.

o Import the data into a PivotTable in the workbook.

2. Explore the data in the PivotTable. When you have finished, close Excel without saving the workbook.

Results: After this exercise, you should have deployed the tabular data model project.

Module Review and Takeaways


In this module, you have learned how to create a tabular data model for Analysis Services by using SQL
Server Data Tools.

Review Question(s)
Question: What are the advantages and disadvantages of tabular models when compared to
multidimensional models?

Module 8
Introduction to DAX
Contents:
Module Overview 8-1

Lesson 1: DAX Fundamentals 8-2

Lesson 2: Enhancing a Tabular Data Model with DAX 8-9

Lab: Using DAX to Enhance a Tabular Data Model 8-20

Module Review and Takeaways 8-26

Module Overview
You can extend Microsoft SQL Server Analysis Services tabular data models by using the Data Analysis
Expressions (DAX) language. DAX is a highly flexible language that enables you to create measures and
calculated columns for use in PivotTable tables and PivotChart charts.

Objectives
This module explains the fundamentals of the DAX language. It also explains how you can use DAX to
create calculated columns and measures, and how you can use them in your tabular data models.

After completing this module, you will be able to:

Describe the fundamentals of DAX.

Use DAX to create calculated columns and measures.



Lesson 1
DAX Fundamentals
To use DAX effectively, you should understand the components of the language and its capabilities. This
lesson explains the fundamentals of DAX and provides an overview of the functionality it offers.

Lesson Objectives
After completing this lesson, you will be able to:

Describe DAX and how it is used.

Describe the different types of DAX functions.

Describe the syntax for writing DAX formulas.

Describe aggregations.

Explain why context is important when writing DAX formulas.

Describe the basics of DAX queries.

Overview of DAX
The DAX language consists of a library of
functions, operators, and constants enabling you
to create formulas that define the calculated
columns and measures to extend the functionality
of tabular data models. DAX was introduced in the
first version of the PowerPivot for Excel add-in, as
the language for writing the formulas that define
business logic in PowerPivot for Excel workbooks.
In SQL Server 2014, you can use DAX in tabular
data models in SQL Server Analysis Services as well
as those you create in PowerPivot for Excel
workbooks.

Typical uses of DAX


DAX makes it possible to extend tabular data models beyond the basic aggregation of numerical columns
based on tables and hierarchies. With DAX, you can:

Create columns and measures based on complex custom calculations.

Create columns that reference data in related tables.

Define parent-child hierarchies.

Create Key Performance Indicators (KPIs).
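
The first two of these capabilities can be sketched with a short example. Assuming a hypothetical model
with a Sales table related to a Product Category table (all table and column names here are illustrative), a
measure and a calculated column might look like this:

```dax
// Measure: a custom calculation combining two aggregations
Profit:=SUM(Sales[SalesAmount]) - SUM(Sales[TotalProductCost])

// Calculated column in the Sales table: retrieves a value from a related table
=RELATED('Product Category'[Category])
```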



DAX Syntax
DAX is based on the Microsoft Excel formula syntax and uses a range of functions, operators, and values.
However, unlike formulas in Excel, DAX does not work on individual cells or ranges of cells. Instead, you
use it to work with the relational columns and tables that you import into a tabular data model. DAX also
includes more advanced functionality than that provided by Excel formulas. The similarity between DAX and Excel
formulas minimizes the learning time for information workers who want to learn to use DAX, enabling
faster adoption of the language and reducing the need for training.

DAX Functions
DAX provides a range of functions you can use to
build formulas. Many of these functions are the
same as, or similar to, those used to build formulas in
Excel. DAX, however, includes many additional
functions that enable you to create much more
complex and sophisticated reports than you can
by using only Excel functions. For example, DAX
includes functions that can navigate table
relationships, take account of context to calculate
values in PivotTable tables, or compare time
periods to produce the desired result. A DAX
formula can contain multiple functions that can be
nested within other functions (up to 64 levels deep).

Note: You should avoid having too many levels of nesting in your formulas because it
makes them difficult to write and troubleshoot.

DAX includes several categories of functions:

Text. These functions include CONCATENATE, TRIM, and BLANK. They are based on the Excel string
functions, and enable you to manipulate strings.

Information. These functions include PATH, ISBLANK, and ISERROR. They enable you to test row
values to see whether they match a type of value such as an error or a null, and to handle
relationships in a table. This category also includes functions such as USERNAME, which you can use
to implement row-level security in SQL Server Analysis Services tabular data models.

Filter and value. These functions include CALCULATE, ALL, and FILTER. They enable you to
manipulate data in a variety of ways, such as modifying the context for calculations, or using the
relationships between tables to return specific results. There is no direct equivalent of these functions
in Excel.

Logical. These functions include TRUE, IF, SWITCH, and NOT. They enable you to perform logical tests
on expressions to return information about the state of the tested values, and to use conditional
logic.

Mathematical and trigonometric. These functions include ROUND, FLOOR, and POWER. They are
similar to the mathematical and trigonometric functions in Excel.

Statistical and aggregation. These functions include SUM, COUNT, RANKX, and DISTINCTCOUNT.
They are similar to aggregation functions in Excel.
MCT USE ONLY. STUDENT USE PROHIBITED
8-4 Introduction to DAX

Date and time. These functions include DATE, MONTH, and YEAR. They are equivalent to date and
time functions in Excel.

Time intelligence. These functions include PARALLELPERIOD, TOTALYTD, and OPENINGBALANCEYEAR. They enable you to work with columns using the date data type in a more sophisticated way, for example, by comparing specific business measures across time periods.

Navigation and Lookup. DAX is designed to work with multiple tables and columns in a tabular data
model. It provides functions such as RELATEDTABLE, USERELATIONSHIP, CROSSJOIN, and
LOOKUPVALUE, which you can use to combine data from multiple tables. You can also use functions
such as PATH and PATHITEM to define navigable hierarchies of column values in a single table.

Reference Links: For more information about functions supported in DAX, go to DAX
Function Reference in SQL Server Books Online at
http://go.microsoft.com/fwlink/?LinkID=248848.

DAX Syntax and Data Types


DAX is syntactically similar to the Excel formula
language, so users already skilled in this area
should be able to start creating DAX formulas
quickly. The following list summarizes basic DAX
syntax and conventions:

All DAX formulas must start with an equal sign (=), followed by an expression.

Expressions can contain functions, operators, constants, and references to columns in tables.
Object names that are referenced in
expressions, such as table and column names,
are case insensitive and can include spaces.
You cannot use leading or trailing spaces, invalid characters in the names of SQL Server Analysis
Services objects, or control characters in table, column, or measure names. Arithmetic, comparison,
text concatenation, and logical operators are largely the same as for formulas in Excel, and mostly
work in the same way.

DAX functions use tables and columns as inputs. To avoid ambiguity, you should use fully qualified
names when referencing columns. A fully qualified name uniquely identifies a column because it
includes the name of the table to which the column belongs. Use brackets around the column names
and single straight quotation marks around table names that contain spaces, for example, 'Order
Date'[Calendar Year]. You can use unqualified names to refer to columns in the same table. For
example, when used in the Order Date table, the 'Order Date'[Calendar Year] column can be
referenced as [Calendar Year].

When you reference one measure from another, you should enclose the measure name in
brackets, for example, [Total Profit].
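For example, assuming that Profit and Revenue measures have already been defined in the model (the names here are illustrative), a new measure can reference them both by enclosing each name in brackets:

```
Margin:=[Profit]/[Revenue]
```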

Note: PowerPivot for Excel and SQL Server Data Tools for Business Intelligence (BI) support
AutoComplete. You should use AutoComplete to ensure accuracy and compliance with
syntactical requirements.

Data Types
DAX uses the following data types:

I8 (eight-byte integer).

R8 (eight-byte real number).

Boolean.

String.

Date.

CY (Currency).

BLANK. Represents a blank or missing value or an empty cell in an Excel worksheet. You can test for
blank values by using the ISBLANK function.

Table. Used by functions that either require a table as input or return a table.

DAX will implicitly convert data types where possible, so there is no need to explicitly convert a column to
a different data type. For example, if you enter a date as a string and use that value in a date function,
DAX will attempt to convert the data type to Date. There are no DAX functions for explicitly converting
data types.

Note: BLANK and NULL are not the same thing. Handling of BLANK data types varies
depending upon the operation that you perform. For example, BLANK + 5 = 5, but BLANK * 5 =
BLANK. However, in SQL Server, both NULL + 5 and NULL * 5 result in a NULL value. SQL Server
NULL values are implicitly converted to BLANK values when you import data into a tabular data
model.
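The following calculated column sketch shows one way to guard against this behavior, using ISBLANK to substitute zero for a blank value before multiplying. The Discount and OrderQuantity column names are hypothetical.

```
=IF(ISBLANK([Discount]), 0, [Discount]) * [OrderQuantity]
```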

Aggregations
Information workers are typically interested in
finding answers to questions such as: "How much
revenue did the company generate through its
Internet sales channel in April?" or: "What was the
best-selling product in France last year?"
Answering these kinds of questions requires you to
group and aggregate data. You group data by
using values that give meaning and context, such
as Product Category or Sales Region. These values
are usually stored in the tables you import from
data sources. You can aggregate data by using
DAX aggregation functions. The following list
contains commonly used aggregation functions:

SUM. Adds up the values in a numeric column, providing a single total for that column. For example,
the following code totals all the values in the Sales Amount column in the Reseller Sales table:

=SUM('Reseller Sales'[Sales Amount])



SUMX. Filters and then adds up the values in a numeric column, returning a single total for that
filtered column. For example, the following code returns the total of the Freight column in the
Internet Sales table, but the calculation only includes rows for which the SalesTerritoryKey
value is 5:

=SUMX(FILTER('Internet Sales', 'Internet Sales'[SalesTerritoryKey]=5),[Freight])

COUNT. Returns a count of the number of rows that are not blank for columns containing either
numbers or dates. COUNTA is similar, but returns a count of the number of rows that are not blank
for columns containing non-numeric values. For example, the following code returns a count of the
rows in the Order Date column in the Reseller Sales table:

=COUNT('Reseller Sales'[Order Date])

COUNTAX. Filters and then counts the number of values that are not blank in a column and returns a
single count for that filtered column. For example, the following code returns the number of rows in
the Phone column in the Reseller table, where the value in the Status column is set to "Active":

=COUNTAX(FILTER('Reseller',[Status]="Active"),[Phone])

The COUNTX function works in the same way as COUNTAX, but only counts numeric values. The
COUNTBLANK function returns the number of blank values in a column, and the COUNTROWS function
counts the number of rows in a table.
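For example, the following formulas count all the rows in the Reseller Sales table and the blank values in one of its columns. The CarrierTrackingNumber column name is illustrative.

```
=COUNTROWS('Reseller Sales')
=COUNTBLANK('Reseller Sales'[CarrierTrackingNumber])
```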
MIN and MAX. Return the minimum and maximum values in a numeric column. The following code
example returns the maximum value for the Sales Amount column in the Reseller Sales table:

=MAX('Reseller Sales'[Sales Amount])

The MAXA and MINA functions return the maximum and minimum numeric values in a column, but they
also handle non-numeric logical values such as TRUE and FALSE, which evaluate to 1 and 0 respectively.

MAXX and MINX. Return the maximum and minimum values in a column in a table filtered by using
an expression. For example, the following code returns the minimum value for the Freight column in
the Internet Sales table for which the corresponding SalesTerritoryKey value is 5:

=MINX(FILTER('Internet Sales', [SalesTerritoryKey] = 5),[Freight])

AVERAGE. Returns the mean of all values in a numeric column. For example, the following code
returns the average value in the Total Product Cost column in the Reseller Sales table:

=AVERAGE('Reseller Sales'[Total Product Cost])

The AVERAGEA function does the same thing as AVERAGE, but also handles non-numeric data types.

AVERAGEX. Enables you to supply an expression to obtain the average of values that are not neatly
contained in a single column. For example, the following code sums the Freight and TaxAmt values
for each row in the Internet Sales table and then calculates the average value:

=AVERAGEX('Internet Sales', 'Internet Sales'[Freight]+ 'Internet Sales'[TaxAmt])



Using Aggregations
When you add a column to the Values area in the PowerPivot Field List in a PowerPivot for Excel
worksheet, column values are automatically summed and the total added to the PivotTable table or
PivotChart chart you are using. If you need to summarize the data differently, you can select another type
of aggregation, such as COUNT, MIN, or AVERAGE, from a list. The DAX formula is then automatically
created for you. When you need to create aggregations more complex than this, you can create a
measure.
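For example, instead of relying on the automatically created sum of a single column, you could define an explicit measure that combines two aggregations. This is a sketch assuming Sales Amount and Total Product Cost columns in the Reseller Sales table:

```
Total Margin:=SUM('Reseller Sales'[Sales Amount]) - SUM('Reseller Sales'[Total Product Cost])
```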

Context
When you create a DAX measure or calculated
column, you define a field you can use in a
PivotTable or a PivotChart. However, the specific
values appearing in any given PivotTable or
PivotChart will depend on the context, such as the
columns and rows added to a PivotTable or a
PivotChart, or the slicers that are enabled at the
time. For example, if you create a PivotTable and
add only the Sum of Sales Amount measure to
the Values area and nothing to the Row Labels
area, you will see a single value representing the
total sales amount for all categories, years, and
countries in the Reseller Sales channel. If you then add Calendar Year to the Row Labels area, you will
see multiple values, each one a total for a given year. Even though the same measure is used in both
cases, the displayed value changes to reflect the context. This context-aware behavior makes it possible to
perform dynamic analysis of data quickly and easily, without needing to reconfigure PivotTables or
PivotCharts whenever you add a new measure.

It is important to be aware of context when you create measures and calculated columns. If you do not
take context into account, you might find your reports do not aggregate data in the way you intended,
and are inaccurate and misleading.

Types of Context
There are three types of context to consider:

1. Row context. When you create a calculated column in a table, you specify the column or columns
that it should reference. For example, the formula in the following code references the First Name
and Last Name columns in the Employee table:

=CONCATENATE([First Name], CONCATENATE(" ", [Last Name]))

The calculated column evaluates the expression for each row. It uses the row context to obtain the values
in the First Name and Last Name columns for each row, and then uses these values to create a new
value for each row by concatenating them. Row context is important when using any function that
evaluates an expression for each row in a table, such as SUMX.

2. Query context. When you add a measure to a PivotTable table, the xVelocity engine evaluates the
measure value for each cell. The evaluation takes account of the query context to calculate these
values for a given cell. Query context is determined by the row labels, column labels, filters, and slicers
that apply to the PivotTable table. Removing or adding a slicer, row label, or column label changes
the query context, and the xVelocity engine must take the new context into account when
calculating the value for each cell.

3. Filter context. You can use DAX functions to modify the row and query contexts by creating filters in
formulas. The FILTER function enables you to produce a subset of rows on which you can perform a
calculation. For example, you might filter by specifying a product category, then aggregate the sales
amount for that category only. You can override filters by using the ALL and ALLEXCEPT functions.
ALL forces all filters to be ignored, and ALLEXCEPT enables you to selectively override them by
specifying particular columns to retain in the filter context; all other columns are ignored.
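For example, the following measure sketch uses CALCULATE with ALL to remove any filters on the Reseller Sales table, so dividing the current context's total by this value yields each cell's percentage of overall sales. The measure and column names are illustrative.

```
Pct of All Sales:=SUM('Reseller Sales'[Sales Amount]) /
    CALCULATE(SUM('Reseller Sales'[Sales Amount]), ALL('Reseller Sales'))
```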

DAX Queries
DAX includes extensions that enable you to write
queries that retrieve data. Client reporting tools,
such as Power View, use DAX to query tabular
data models in SQL Server Analysis Services. It is
unlikely that you will need to write DAX queries
manually, but it can be useful to have a basic
understanding to help with troubleshooting.

You can retrieve data from tables by using the EVALUATE keyword, which is functionally
equivalent to the SELECT statement in SQL Server. The following code example returns all the rows
from the Reseller Sales table:

EVALUATE('Reseller Sales')

Queries can include FILTER to return a reduced result set, similar to the way in which a WHERE clause
returns a reduced result set in SQL Server. The following code example returns all the rows from the
Reseller Sales table where the OrderDateKey value is greater than 20040101:

EVALUATE(FILTER('Reseller Sales', 'Reseller Sales'[OrderDateKey]>20040101))

You can also order result sets. The following code example returns all the rows from the Reseller Sales
table where the OrderDateKey value is greater than 20040101. The results are ordered by Sales Amount:

EVALUATE(FILTER('Reseller Sales', 'Reseller Sales'[OrderDateKey]>20040101))
ORDER BY 'Reseller Sales'[Sales Amount] ASC
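Queries can also group and aggregate data. For example, the following sketch uses the SUMMARIZE function to total reseller sales by calendar year; the Order Date table and its column names are assumed here for illustration:

```
EVALUATE
SUMMARIZE('Reseller Sales',
    'Order Date'[Calendar Year],
    "Total Sales", SUM('Reseller Sales'[Sales Amount]))
ORDER BY 'Order Date'[Calendar Year]
```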

Lesson 2
Enhancing a Tabular Data Model with DAX
You can use DAX to create calculated columns and measures, which you can then use in PivotTable tables and PivotChart
charts in PowerPivot for Excel workbooks. This lesson shows you how to create DAX calculated columns
and measures using a variety of functions, including time intelligence functions and functions that
manipulate table relationships. This lesson also shows you how to create dynamic measures that can use
conditional logic to perform calculations based on different inputs.

Lesson Objectives
After completing this lesson, you will be able to:

Explain how to use calculated columns and measures.

Describe how to work with relationships between tables.

Describe time intelligence.


Describe dynamic measures.

Calculated Columns and Measures


The most common use of DAX in a tabular data
model is to define calculated columns and
measures in your tables. Formatting options for
calculated columns and measures depend on the
data type, and include text, date, decimal number,
whole number, percentage, scientific notation,
currency, and a range of date and time formats.

Calculated Columns
A calculated column is a named column that you
populate by using a DAX formula. The formula will
usually refer to other columns, either in the same
table or in a different, related table, but you can
also create calculated columns based on measures, or on other calculated columns.

The following code example creates a column that calculates profit by subtracting the value in the Total
Product Cost column from the value in the Sales Amount column. You could use this DAX expression to
create a calculated column named Sales Profit.

A DAX Expression for a Calculated Column


=[Sales Amount] - [Total Product Cost]

You can create calculated columns in a tabular data model just as you can in an Excel workbook. In the
Data View window of a data model designer, select the table to which you want to add the column, and
then click Add Column. You can then name the column and type the DAX formula in the formula bar,
which will determine the contents of the column. The xVelocity engine performs the calculation
immediately to populate the column. By default, the engine recalculates columns automatically if the
source data changes.

Note: The formula used to create a calculated column applies to every row. You cannot only
apply the formula to selected data ranges as you can in Excel worksheets.

When creating calculated columns, be aware of the following considerations:


Calculated columns calculate a value for every row in a table. This can take a long time when
tables contain a large number of rows.
Changing the names of columns referenced by a calculated column will cause an error.

Measures
A measure is a named formula that can encapsulate complex business logic, usually by aggregating one
or more numerical values.

The following example defines a measure named Profit that calculates the sum of the Sales Profit
column:

A DAX Expression for a Measure


Profit:=SUM([Sales Profit])

Measures are defined in the measure grid in the Data View of the data model designer. In client tools,
measures are associated with the table in which they have been defined. If you need to create a measure
that is not associated with an existing table, you can paste an Excel worksheet containing only a
placeholder column header value into the data model as a new, empty table. You can then define
measures in the measure grid for the empty table.

Key Performance Indicators


Key Performance Indicators (KPIs) are used to
compare measures with a given goal or target
value to assess business performance. For example,
a company might set a profit margin goal of 15
percent, and use a KPI to indicate how actual sales
results compare.
In tabular data models, KPIs are defined based on
measures, and can consist of the following
elements:
Base Value: The actual value being assessed,
calculated by a measure.

Target Value: The goal against which the base value is compared. This can be an absolute value or
calculated by a measure.

Status Thresholds: The status is an indicator of the base value's performance against the target
value. You can set thresholds that establish the status based on the percentage of the target value
achieved by the base value. For example, you could use a red indicator if the base value is below 20
percent of the target value, a yellow indicator if it is between 20 and 80 percent, and a green
indicator if it is above 80 percent.
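For example, the base and target values for such a KPI could be defined as the following pair of measures. This is a sketch: the table and column names, and the fixed 15 percent goal, are illustrative.

```
Margin:=SUM('Internet Sales'[Sales Profit]) / SUM('Internet Sales'[Sales Amount])
Target Margin:=0.15
```

The status thresholds are then configured against these measures in the Key Performance Indicator dialog box, as the demonstration below shows.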

Demonstration: Creating Calculated Columns and Measures


In this demonstration, you will see how to:

Create Calculated Columns.

Create Measures.
Create a KPI.

Analyze Calculated Columns and Measures.

Demonstration Steps
Create Calculated Columns

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and log
into the 20466C-MIA-SQL virtual machine as ADVENTUREWORKS\Student with the password
Pa$$w0rd. Then, in the D:\Demofiles\Mod08 folder, right-click Setup.cmd and click Run as
administrator. When prompted, click Yes.

2. Start Visual Studio and open the TabularDemo.sln solution in the D:\Demofiles\Mod08 folder. If the
Tabular model designer dialog box is displayed, in the Workspace server list, specify
localhost\SQL2 and then click OK. Then in Solution Explorer, double-click Model.bim to open the
data model designer. If you are prompted to run a script on the server, click Yes.
3. On the Model menu, point at Process, and click Process All. If you are prompted for impersonation
credentials, specify the user name ADVENTUREWORKS\ServiceAcct with the password Pa$$w0rd
and click OK. Then when all of the tables have been processed, click Close.
4. On the Customer tab, in the first empty column after the existing columns, double-click Add
Column and name the new column Full Name.

5. With the Full Name column selected, in the formula bar, enter the following DAX expression:

=CONCATENATE([FirstName], " " & [LastName])

6. Wait for the table to finish updating, and note the calculated values in the new column.

7. Click the FirstName column heading, hold Shift and click the LastName column heading, and then
right-click either of the selected column headings and click Hide from Client Tools.
8. On the Internet Sales tab, in the first empty column after the existing columns, double-click Add
Column and name the new column SalesProfit.

9. With the SalesProfit column selected, in the formula bar, enter the following DAX expression:

=[SalesAmount]-[TotalProductCost]

10. Wait for the table to finish updating, and note the calculated values in the new column.

Create Measures

1. On the Internet Sales tab, click the first empty cell in the measure grid under the SalesProfit
column.

2. In the formula bar, enter the following DAX expression:

Profit:=SUM([SalesProfit])

3. Select the cell containing the Profit measure, and press F4 to display the properties pane. Then set
the Format property to Currency.

4. Click the empty cell in the measure grid under the Profit measure.

5. In the formula bar, enter the following DAX expression:

Margin:=[Profit]/[Revenue]

6. Select the cell containing the Margin measure, and press F4 to display the properties pane. Then set
the Format property to Percentage.

7. Right-click the SalesProfit column header and click Hide from Client Tools.

8. On the File menu, click Save All.

Create a KPI

1. On the Internet Sales tab, click the empty cell in the measure grid under the Margin measure.
2. In the formula bar, enter the following DAX expression:

Target Margin:=(SUM('Product'[ListPrice]) - SUM('Product'[StandardCost])) / SUM('Product'[ListPrice])

3. Select the cell containing the Target Margin measure, and press F4 to display the properties pane. Then
set the Format property to Percentage.

4. On the Internet Sales tab, right-click the cell in the measure grid containing the Margin measure
and click Create KPI.

5. In the Key Performance Indicator (KPI) dialog box, note that the KPI base measure (value) is
defined by the Margin measure. Then, under Define target value, ensure that Measure is selected
and select Target Margin.

6. Set the first status threshold to 65% and the second to 90%.

7. Note the default icon style, and click OK.

8. On the File menu, click Save All.

Analyze Columns and Measures


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, in the Internet Sales table, select the Profit and Margin measures so that they
are summarized in the PivotTable.

4. In the Customer table, select Full Name so that the PivotTable shows profit and margin by customer.

5. In the PivotTable Fields pane, expand KPIs, expand Margin, and select Status so that an indicator
shows the KPI status for each customer.

6. Close Excel without saving the workbook.

7. Keep Visual Studio open for the next demonstration.



Working with Related Tables


Most data models consist of multiple tables that
are related, based on common key values. There
are many cases where a DAX expression needs to
combine data from related tables, and DAX
provides some functions to support this
requirement.

Looking Up Related Data Values


DAX provides a LOOKUPVALUE function you can
use to return a column value from a table, based
on a specified value for another column. For
example, if a Customer table contains a
Registration Date column, and a Date table
contains Full Date and Fiscal Year columns, you could use the following expression in a calculated
column in the Customer table to look up the fiscal year in which the customer registered:

LOOKUPVALUE('Date'[Fiscal Year], [Full Date], [Registration Date])

The LOOKUPVALUE function does not require an existing relationship between tables; it simply looks up a column
value based on the specified value for another column in the same table. If a relationship is defined
between two tables, a more effective way to look up related values is to use the RELATED function. For
example, if a Product table is related to a Product Category table, based on a common CategoryKey
column, you could use the following expression to return the Category Name value from the Category
table into a calculated column in the Product table:

RELATED('Product Category'[Category Name])

Note that the RELATED function does not require that you specify a lookup value. It automatically uses
the active relationship between the tables to find the related value. The RELATED function works well
when you need to look up a single value from the "many" side of a one-to-many relationship. In the
example above, each category exists once in the Product Category table but many times in the Product
table.

Occasionally, you might want to aggregate many related values from the "one" side of the relationship.
For example, in the Product Category table, you might want to create a calculated column named
Product Count containing the total number of products in each category. To accomplish this task, you
can use the RELATEDTABLE function shown in the following example:

COUNTX(RELATEDTABLE('Product'), [ProductKey])

Referencing Tables with Multiple Relationships


Tabular data models support multiple relationships between tables. This is useful if, for example, the Date
and Sales tables have multiple relationships to support different dates such as order date, due date, and
delivery date. Multiple relationships appear in the diagram view of the data model designer, with the
active relationship shown as an unbroken line. Non-active relationships appear as dotted lines.

Only one relationship can be active and this will automatically be used when performing calculations
using data from the two tables. You can use the USERELATIONSHIP function when you are writing DAX
formulas to override this behavior and force PowerPivot to use the relationship that you specify. For
example, if the default relationship between your fact table and date table is based on the Order Date
column, but you need to aggregate data by shipping date, you can write a DAX measure containing the
USERELATIONSHIP function to achieve this.

Note: USERELATIONSHIP does not create a relationship. The relationship must already exist
for you to use it.

For example, the following code calculates the sum of the Sales Amount column based on shipping date:

=CALCULATE(SUM('Reseller Sales'[Sales Amount]),
    USERELATIONSHIP('Reseller Sales'[ShipDateKey], 'Date'[DateKey]))

Note: You can only use USERELATIONSHIP with certain other functions, such as
CALCULATE, which can accept a filter as an argument.

Note: For more information about the USERELATIONSHIP function, go to


http://go.microsoft.com/fwlink/?LinkID=246793.

Demonstration: Using Relationships


In this demonstration, you will see how to:

Retrieve a Value from an Unrelated Table.

Retrieve a Value from a Related Table.

Aggregate Related Values.

Analyze Related Data.

Demonstration Steps
Retrieve a Value from an Unrelated Table

1. Ensure that you have completed the previous demonstration.

2. In Visual Studio, with the Model.bim pane open in data view, on the Customer tab, after the existing
columns, double-click the Add Column header and enter the column name First Fiscal Year.

3. With the First Fiscal Year column selected, in the formula bar, enter the following DAX expression:

=LOOKUPVALUE('Order Date'[FiscalYear], [Date], [DateFirstPurchase])

4. Wait for the table to finish updating, and note the calculated values in the new column.

Retrieve a Value from a Related Table

1. On the Customer tab, after the existing columns, double-click the Add Column header and enter the
column name Country-Region.

2. With the Country-Region column selected, in the formula bar, enter the following DAX expression:

=RELATED('Geography'[CountryRegionCode])

3. Wait for the table to finish updating, and note the calculated values in the new column.

Aggregate Related Values

1. On the Customer tab, after the existing columns, double-click the Add Column header and enter the
column name Order Count.

2. With the Order Count column selected, in the formula bar, enter the following DAX expression:

=COUNTAX(RELATEDTABLE('Internet Sales'), [SalesOrderNumber])

3. Wait for the table to finish updating, and note the calculated values in the new column.

4. On the File menu, click Save All.


Analyze Related Data

1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, in the Internet Sales table, select the Revenue measure.

4. In the Customer table, select First Fiscal Year so that the PivotTable shows sales revenue by
customer based on the first year in which customers made a purchase.

5. In the Customer table, clear First Fiscal Year and select Country-Region so that the PivotTable
shows sales revenue by customer country or region.
6. In the Customer table, clear Country-Region and select Order Count so the PivotTable shows sales
revenue by customer based on the number of orders placed.

7. Close Excel without saving the workbook.


8. Keep Visual Studio open for the next demonstration.

Introduction to Time Intelligence


Time intelligence functions enable you to compare
data from one time period against equivalent data
from another. For example, you might want to
compare sales for the first quarter of this year
against sales for the first quarter of last year.

To support time intelligence functionality, you should ensure that your data and tables meet the
following criteria:

The model should have a separate table that contains only date information.

The date table should have a continuous range of dates without gaps.

The date table column that uses the date data type should use day as the lowest level of granularity.

Relationships between a table containing a measure and a date table do not need to be based on
columns that use the date data type to support time intelligence functionality. Instead, you can base these
relationships on columns containing integer values. This is useful because it is likely that the databases
you use as sources for analysis will contain fact and date tables that are related by columns using the
integer data type, particularly if the source database is a data warehouse. For example, in the
AdventureWorksDW database, the FactResellerSales table relates to the DimDate table on the
DateKey column. To handle time intelligence correctly when integers are used in relationships, you must
manually mark the date table as Date Table and specify the column in the table that contains the date
values. This column must use the date data type.

Time Intelligence Functions


DAX provides a range of time intelligence functions that support common business analysis scenarios.
These include (among many others):

NEXTMONTH, NEXTQUARTER, NEXTYEAR. These functions return a table that contains a range of
dates representing the time period after the current context.

PREVIOUSMONTH, PREVIOUSQUARTER, PREVIOUSYEAR. These functions return a table that contains
a range of dates representing the time period preceding the current context.

SAMEPERIODLASTYEAR, PARALLELPERIOD. These functions enable you to compare measure values
across the same period in different timespans. For example, to compare sales for a month in the
current year with sales in the same month the previous year.

TOTALMTD, TOTALQTD, TOTALYTD. These functions calculate a running total for the corresponding
time period. For example, you could use TOTALQTD to calculate sales for the quarter so far.
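For example, the following measure sketch uses TOTALYTD to calculate a running year-to-date revenue total. It assumes an Order Date table that has been marked as the date table, with a Date column of the date data type; the table and column names are illustrative.

```
Revenue YTD:=TOTALYTD(SUM('Internet Sales'[SalesAmount]), 'Order Date'[Date])
```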

Note: For more information about Time Intelligence Functions, go to Time Intelligence
Functions (DAX) in SQL Server Books Online at http://technet.microsoft.com/en-
us/library/ee634763.aspx.

Demonstration: Using Time Intelligence


In this demonstration, you will see how to:

Mark a Table as a Date Table.

Create a Measure that uses Time Intelligence.


Analyze Data Using Time Intelligence.

Demonstration Steps
Mark a Table as a Date Table

1. Ensure you have completed the previous demonstration.

2. In Visual Studio, with the Model.bim pane open in data view, click the Order Date tab.

3. On the Table menu, point at Date and click Mark As Date Table.

4. In the Mark As Date Table dialog box, select the Date column, and click OK.

Create a Measure that uses Time Intelligence

1. In the model designer, click the Internet Sales tab.

2. Click the first empty cell in the measure grid.

3. In the formula bar, enter the following DAX expression:

Previous Year Revenue:=CALCULATE(SUM([SalesAmount]), SAMEPERIODLASTYEAR('Order Date'[Date]))

4. Select the cell containing the Previous Year Revenue measure, and press F4 to display the properties
pane. Then set the Format property to Currency.

5. On the File menu, click Save All.

Analyze Data Using Time Intelligence


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.
3. When Excel opens, in the Internet Sales table, select the Revenue and Previous Year Revenue
measures.

4. In the Order Date table, drag Calendar Date to the Rows area so that the PivotTable shows sales
revenue and previous year revenue for each year.

5. Expand 2007 and 2008, and note that the revenue for each quarter in 2007 is shown as the previous
year revenue for the same quarters in 2008.
6. Close Excel without saving the workbook.

7. Close Visual Studio.



Implementing Parent-Child Hierarchies


There is no inherent support for a parent-child
hierarchy within a tabular data model. To define a
parent-child hierarchy you must use DAX
expressions including functions that can be used
to specify the path of parent and child IDs and the
distinct levels you want to support within the
hierarchy. Understanding the functions available is
important because parent-child hierarchies are commonplace within an analytical data model. To
create a parent-child hierarchy in a tabular data
model, you must use the PATH, LOOKUPVALUE,
and PATHITEM DAX functions.

The PATH function


The PATH function returns a delimited text value containing the chain of parent key values related to the
key value on which the function is based. You can create a calculated column that stores this path.
For example, you could create an additional column named EmployeeLevels that stores this value in an
Employee table. The PATH function has the following syntax:

PATH(<id_columnName>, <parent_columnName>)

<id_columnName> refers to the key column in a table. For example, in an Employees table, the key
column might be EmployeeID.

<parent_columnName> refers to the parent column for a key column in a table. For example, in the
Employees table, this might be ManagerID.
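For example, in a hypothetical Employees table, a calculated column defined with the following expression stores the chain of manager keys for each employee (the column names match the EmployeeID and ManagerID examples above):

```DAX
-- Calculated column returning a pipe-delimited chain of ancestor keys.
-- For an employee 189 whose manager is 23, who in turn reports to 1,
-- the result would be "1|23|189" (hypothetical key values).
=PATH([EmployeeID], [ManagerID])
```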

Note: There are additional DAX functions similar to PATHITEM, including PATHITEMREVERSE,
PATHLENGTH, and PATHCONTAINS. For more information, go to SQL Server Books Online.

The LOOKUPVALUE and PATHITEM functions


To return a specific value and more meaningful information within a parent-child hierarchy from the
PATH function, you can use the LOOKUPVALUE and PATHITEM functions together. The LOOKUPVALUE
function returns the value to display and PATHITEM determines from which level the value should be
returned. For example, instead of returning an EmployeeID and a ManagerID in a parent-child hierarchy,
both functions could be used together to return the Name attribute of an employee.

The LOOKUPVALUE function has the following syntax:

LOOKUPVALUE( <result_columnName>, <search_columnName>, <search_value>[,


<search_columnName>, <search_value>])

<result_columnName> is the name of an existing column that contains the value you want to return.

<search_columnName> is the name of an existing column on which the lookup is performed.

<search_value> provides a filter for the LOOKUPVALUE function. This can include a string literal value
or another function, such as PATHITEM, to filter the data.

Optionally, additional <search_columnName> and <search_value> parameters can be defined.

The PATHITEM function has the following syntax:



PATHITEM(<path>, <position>[, <type>])

<path> refers to a column that contains the results of the PATH function. In the example of an
Employees table, this could be a column named EmployeeLevels.

<position> is an integer expression referring to the position of the item to be returned.

<type> is an optional parameter that determines the data type of the returned result. A value of 0
(the default) returns text, and a value of 1 returns an integer.
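For example, if a row's EmployeeLevels column contains the hypothetical path value "1|23|189", you could extract the key at the second level like this:

```DAX
-- Returns the second item in the stored path as an integer.
-- For the hypothetical path "1|23|189", the result is 23.
=PATHITEM([EmployeeLevels], 2, 1)
```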

You can use the LOOKUPVALUE and PATHITEM functions together to create a parent-child hierarchy. An
additional calculated column can be created in a column named Level1 that contains the following code
to populate the Name value for the top-level manager in an employee's management chain:

LOOKUPVALUE ([Name], [EmployeeId], PATHITEM ([EmployeeLevels], 1))

The calculated column to create the second level of the employee hierarchy in a column named Level2 is
shown in the following code example:

LOOKUPVALUE ([Name], [EmployeeId], PATHITEM ([EmployeeLevels], 2))

When the new calculated columns are defined to represent the different levels of an employee hierarchy,
you can use the Create Hierarchy button in the data model designer, and then click and drag each level
into the new hierarchy.

Design considerations
When you design parent-child hierarchies in analytical data models, consider the following points:

Ensure that the parent and child keys are of compatible data types.

Ensure that a self-join relationship exists between the parent key and the child key for best query and
processing performance.

Lab: Using DAX to Enhance a Tabular Data Model


Scenario
A senior business analyst at Adventure Works Cycles wants to perform some detailed analysis of the data
in a tabular data model. To accomplish this, you will need to use DAX to enhance the data model.

Objectives
After completing this lab, you will be able to:

Use DAX to create calculated columns.

Use DAX to create measures.

Create a KPI that uses a DAX expression for the target value.

Use DAX to implement a parent-child hierarchy.

Estimated Time: 60 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student


Password: Pa$$w0rd

Exercise 1: Creating Calculated Columns


Scenario
Business users have requested the ability to:

View measures by employee, displaying employee full name.


View profit for each sale.

View measures by a product category hierarchy.

The main tasks for this exercise are as follows:


1. Prepare the Lab Environment

2. Concatenate Text Values

3. Calculate a Numeric Value


4. Retrieve Related Values

5. View Calculated Columns in Excel

Task 1: Prepare the Lab Environment


1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab08\Starter folder as Administrator.

Task 2: Concatenate Text Values


1. Use Visual Studio to open the AWSalesTab.sln solution in the D:\Labfiles\Lab08\Starter folder, and
view the data model designer. If the Tabular model designer dialog box is displayed, in the
Workspace server list, specify localhost\SQL2 and then click OK.

2. View the model, running a script on the server if prompted, and then process the model using the
impersonation credentials ADVENTUREWORKS\ServiceAcct with the password Pa$$w0rd.

3. In the Employee table, create a custom column named Employee Name that concatenates the first
name, middle name (if there is one), and last name of the employee. You can use the following DAX
expression to do this:

=[First Name] & IF(ISBLANK([Middle Name]), "", CONCATENATE(" ", [Middle Name])) & CONCATENATE(" ",
[Last Name])

4. After you have defined the Employee Name column, hide the First Name, Middle Name, and Last
Name columns from client tools and save the data model.

Task 3: Calculate a Numeric Value


1. In the Reseller Sales table, create a custom column named SalesProfit that calculates profit by
subtracting the total product cost from the sales amount. You can use the following DAX expression
to do this:

=[SalesAmount]-[TotalProductCost]

2. After you have defined the SalesProfit column, confirm that its data type has been automatically
detected as Currency, and then save the model.

Task 4: Retrieve Related Values


1. In the Product table, create a column named Subcategory that returns the related Subcategory
column from the Product Subcategory table. You can use the following DAX expression to do this:

=RELATED('Product Subcategory'[Subcategory])

Note: Some products at the beginning of the table are uncategorized.


2. In the Product table, create a column named Category that returns the related Category column
from the Product Category table. You can use the following DAX expression to do this:

=RELATED('Product Category'[Category])

Note: Some products at the beginning of the table are uncategorized.


3. In the Product table, create a hierarchy named Categorized Products that includes the following
columns:

o Category

o Subcategory

o Product Name

4. Hide all columns in the Product Subcategory and Product Category tables from client tools.

5. Save the data model.

Task 5: View Calculated Columns in Excel


1. Analyze the model you have created in Excel.

2. View the Revenue measure by the Categorized Products hierarchy on rows, and the Employee
Name column on columns.

3. Verify that the calculated columns work as expected, and then close Excel without saving the
workbook.

Results: After this exercise, you should have a calculated column named Employee Name in the
Employee table, a calculated column named SalesProfit in the Reseller Sales table, and a hierarchy
named Categorized Products in the Product table.

Exercise 2: Creating Measures


Scenario
Business users have requested the ability to view the following aggregated measures:

Profit

Margin

Previous Year Revenue

The main tasks for this exercise are as follows:

1. Create a Measure that Aggregates a Column

2. Create a Measure that References Other Measures

3. Create a Measure that Uses Time Intelligence

4. View Measures in Excel

Task 1: Create a Measure that Aggregates a Column


1. In the Reseller Sales table, create a measure named Profit that sums the values in the SalesProfit
column. You can use the following DAX expression to do this:

Profit:=SUM([SalesProfit])

2. Format the Profit measure as Currency.


3. Hide the SalesProfit column from client tools.

4. Save the data model.

Task 2: Create a Measure that References Other Measures


1. In the Reseller Sales table, create a measure named Margin that divides the Profit measure by
Revenue. You can use the following DAX expression to do this:

Margin:=[Profit]/[Revenue]

2. Format the Margin measure as Percentage.

3. Save the data model.

Task 3: Create a Measure that Uses Time Intelligence


1. Mark the Order Date table as a date table, specifying the Date column.

2. In the Reseller Sales table, create a measure named Previous Year Revenue that uses the
SAMEPERIODLASTYEAR function to calculate the sum of the SalesAmount column for the equivalent
time period in the previous year. You can use the following DAX expression to do this:

Previous Year Revenue:=CALCULATE(SUM([SalesAmount]), SAMEPERIODLASTYEAR('Order Date'[Date]))

3. Format the Previous Year Revenue measure as Currency.

4. Save the data model.



Task 4: View Measures in Excel


1. Analyze the model you have created in Excel.

2. View the Revenue, Profit, Margin, and Previous Year Revenue measures by the Calendar Date
hierarchy on rows.

3. Verify that the measures you created all work as expected, and then close Excel without saving the
workbook.

Results: At the end of this exercise, the Reseller Sales table should contain the following measures:

Profit

Margin

Previous Year Revenue

Exercise 3: Creating a KPI


Scenario
Adventure Works wants to achieve year-on-year revenue growth of 20 percent. To track this goal, you
plan to create a KPI.

The main tasks for this exercise are as follows:

1. Create a Measure to Calculate a KPI Goal

2. Create a KPI

3. View a KPI in Excel

Task 1: Create a Measure to Calculate a KPI Goal


1. In the Reseller Sales table, create a measure named Revenue Goal that multiplies the Previous Year
Revenue measure by 1.2. You can use the following DAX expression to do this:

Revenue Goal:=[Previous Year Revenue] * 1.2

2. Format the Revenue Goal measure as Currency and hide it from client tools.

3. Save the data model.

Task 2: Create a KPI


1. In the Reseller Sales table, create a KPI based on the Revenue measure.

2. Configure the KPI to use the Revenue Goal measure as the target value, and set status thresholds at
75% and 95%.

3. When you have created the KPI, save the data model.

Task 3: View a KPI in Excel


1. Analyze the model you have created in Excel.

2. View the Value, Goal, and Status of the Revenue KPI by the Fiscal Date hierarchy on rows.

3. Verify that the KPI indicates the status appropriately for the quarters and months in 2008, and then
close Excel without saving the workbook.

Results: At the end of this exercise, the Reseller Sales table should include a measure named Revenue
Goal and a KPI based on the Revenue measure.

Exercise 4: Implementing a Parent-Child Hierarchy


Scenario
The CEO at Adventure Works wants to view sales results aggregated by each sales manager, and then drill
down to view individual employee sales results.

The main tasks for this exercise are as follows:

1. Create a Path Column

2. Create a Column for Each Hierarchy Level

3. Use the Calculated Columns in a Hierarchy

4. View a Parent-Child Hierarchy in Excel

Task 1: Create a Path Column


1. In the Employee Table, add a column named Path that uses the PATH function to define the
recursive chain of parent employee keys. You can use the following DAX expression to do this:

=PATH([EmployeeKey], [ParentEmployeeKey])

2. Save the data model.

Task 2: Create a Column for Each Hierarchy Level


1. In the Employee Table, add the following calculated columns:

o Level1

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 1, 1))

o Level2

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 2, 1))

o Level3

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 3, 1))

o Level4

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 4, 1))

2. Save the data model.

Task 3: Use the Calculated Columns in a Hierarchy


1. Create a hierarchy named Employee Hierarchy in the Employee table.

2. Add the Level1, Level2, Level3, and Level4 columns to the hierarchy in that order.

Task 4: View a Parent-Child Hierarchy in Excel


1. Analyze the model you have created in Excel.

2. View the Revenue measures by Employee Hierarchy.



3. Expand the employee hierarchy to view sales totals for managers and their subordinates. Note that
the personal revenue for sales managers is shown with a blank employee name under the total for
that sales manager.

4. When you have finished, close Excel without saving the workbook, and close Visual Studio.

Results: At the end of this exercise, the Employee table should include a hierarchy named Employee
Hierarchy.

Module Review and Takeaways


In this module, you have learned how to use DAX to enhance a tabular data model.

Review Question(s)
Question: Will time intelligence functionality be useful to you? What kinds of time-based
analyses might you want to perform?

Module 9
Implementing Reports with SQL Server Reporting Services
Contents:
Module Overview 9-1

Lesson 1: Introduction to Reporting Services 9-2

Lesson 2: Creating a Report with Report Designer 9-6

Lesson 3: Grouping and Aggregating Data in a Report 9-16

Lesson 4: Publishing and Viewing a Report 9-23

Lab: Creating a Report with Report Designer 9-27

Module Review and Takeaways 9-32

Module Overview
One of the most common applications of Business Intelligence (BI) is to provide information to business
users by way of reports. Microsoft SQL Server 2014 includes SQL Server Reporting Services, a
comprehensive platform for creating, publishing, and managing reports. The ability to design, develop,
and deliver accurate and effective reports is a valuable skill for any BI developer. Familiarity with SQL
Server Reporting Services will help you gain that skill.

Objectives
This module discusses the tools and techniques that a professional BI developer can use to create and
publish reports. After completing this module, you will be able to:

Describe the key features of Reporting Services.

Use Report Designer to create a report.

Group and aggregate data in a report.

Publish and view a report.



Lesson 1
Introduction to Reporting Services
SQL Server Reporting Services is a component of SQL Server that provides a platform for building and
managing a reporting solution. Reporting Services is available in most editions of SQL Server, including
SQL Server Express with Advanced Services, though not all features are supported in all editions.

Before you start building a reporting solution, it is important to be familiar with common reporting
scenarios and how Reporting Services can be used to support them. This lesson provides an overview of
how Reporting Services can be used to provide a reporting solution in a business organization.

Lesson Objectives
After completing this lesson, you will be able to:

Describe common reporting scenarios.

Describe the modes in which Reporting Services can be deployed.


Identify appropriate reporting tools for roles in a reporting scenario.

List the report rendering formats supported by Reporting Services.

Reporting Scenarios
Every organization is different, with its own
specific requirements for business reporting.
However, there are some common reporting
scenarios with which BI developers should be
familiar in order to design appropriate reporting
solutions.

Common reporting scenarios found in many organizations include:

Scheduled delivery of standard reports. BI developers create a set of standard business reports that
users receive by email or delivery to a file share on a regular basis.

On-demand access to standard reports. Business users consume reports on demand by browsing
to an online report repository.

Embedded reports and dashboards. Reports are integrated into an application or portal to provide
contextualized business information at a glance.

Request to IT for custom reports. Business users request specific reports from the IT department,
and a BI developer creates the reports to order.

Self-service reporting. Business users can use report authoring tools to create their own reports,
often based on data sources and datasets published by the IT department.

Most reporting solutions address one or more of these scenarios. Before starting development on a
reporting solution, it is important to understand the scenarios that it must support.

Reporting Services Modes


When installing SQL Server Reporting Services, you
must choose between two possible deployment
modes:

SharePoint integrated mode. In Microsoft SharePoint integrated mode, the report server is installed
as a SharePoint 2013 service in a SharePoint Server farm. Users manage and view reports on a
SharePoint site.

Native mode. In native mode, Reporting Services provides a management and report-viewing user
interface called Report Manager, implemented as a stand-alone web application.

In business environments where SharePoint technologies are used, SharePoint Integrated mode is the
preferred deployment mode for SQL Server Reporting Services. SharePoint integration has been
completely reworked in this release of SQL Server, and provides the following benefits:

Centralized configuration management in SharePoint Central Administration.

Scale-out support for integrated Reporting Services applications.

A common security identity for Reporting Services and other SharePoint shared services through the
SharePoint Shared Service application pool.

Support for claims-based authentication.

SharePoint cross-farm support for viewing reports.

Integration with SharePoint backup and logging features.

Reporting Roles and Authoring Tools


SQL Server Reporting Services supports multiple
types of report author by providing a range of
authoring tools and environments.

Reporting Roles
Formal names for roles in a reporting scenario can
vary across organizations. However, the following
roles describe common categories of users involved
in developing or consuming reports:

BI Developer. A professional business intelligence developer who creates sophisticated reports for
business users, and who is familiar with the underlying data warehouse, data marts, and other data
sources used. Job roles that fall into this category include BI specialists such as report developers and
data warehouse developers.

IT Professional. A technical employee usually responsible for managing IT infrastructure but, in small
organizations or departments, often required to create reports for business users. Job roles that fall
into this category include technology-based employees, such as IT administrators, database
developers, and programmers.

Power User. A business user that is not a technical professional but has a good knowledge of IT
systems and principles, as well as the ability to use advanced features of tools and programs. Job roles
that fall into this category include users who need to perform detailed analysis of data, such as
business analysts and financial accountants.

Information Worker. A business user who relies on information generated from IT systems to
perform their job, but who has no particular skills with IT tools or systems. Job roles that fall into this
category include anyone who consumes data as an aid to decision-making as part of their daily
activity, such as sales employees, customer support engineers, and business executives.

Authoring Tools
SQL Server provides the following report authoring tools and environments:
Report Designer. A professional report development interface integrated into the Visual Studio
development environment and packaged with other related tools in SQL Server Data Tools for BI.
Report Designer provides support for project-based development, enabling you to create multiple
data sources, datasets, and reports as part of a single project. You can also take advantage of source
control support in Visual Studio Team Foundation Server. Report Designer is the preferred
development tool for BI professionals who need to create and maintain business reports for an
organization.

Report Builder. A ClickOnce application that users can install directly from the Report Manager or
SharePoint document library where they view reports. Report Builder provides an easy-to-use,
feature-rich report development environment that supports reusable elements and flexible content layout.
Report Builder provides enough sophistication for IT professionals and power users to create detailed
business reports, but does not require the user to install and utilize a fully-featured development
environment.

Power View. An interactive data visualization environment in which users can drag data entities into
a previously created data model to produce graphical representations of business data. Power View is
designed to support reporting environments where the emphasis is on the visual exploration of data
rather than the creation of detailed, formal reports. Power View is an ideal tool for information
workers who want to explore data visually in an intuitive, easy to use environment.

Note: The rest of this module focuses on professional report development with Report
Designer. Self-service reporting with Report Builder and Power View are discussed in Course
20467C: Designing Self-Service BI and Big Data Solutions.

Report Rendering Formats


When planning a reporting solution, you must
consider the ways in which reports will be
consumed. Typically, reports are consumed in one
of the following three ways:

Online. Users view the report in a web browser.

As a document in an application. Users open the report in an application, such as Microsoft Excel.

Printed. Users print the report and view it in hard copy.

When you publish a report to a Reporting Services server, it can be rendered in an interactive
HTML-based report viewer that enables users to consume the report online. However, users can also export the
report to a number of formats, and print it for offline viewing. You can also deliver reports in specific file
formats through subscriptions, where users do not need to view the report online. They can open the
report directly in an application for viewing or printing.

Reporting Services supports this range of options for viewing reports by using an extensible rendering
architecture, in which a report can be provided in any format for which a renderer is available. Each
renderer has its own strengths, limitations, and features, so understanding which renderers must be
supported in your reporting scenario can help you optimize reports to suit the pagination, image support,
and interactivity available in your target formats. By default, Reporting Services includes renderers for the
following formats:

Microsoft Excel (2003 or 2007-2010)

Microsoft Word (2003 or 2007-2010)

Comma-Separated Value (CSV)

XML

Web Archive

Image

Adobe Acrobat (PDF)

Atom Feed

Note: When developing reports that might be rendered in multiple formats, test each
render format to ensure that the pagination and formatting provides the desired output.

Lesson 2
Creating a Report with Report Designer
Report Designer is a professional report authoring interface for the Visual Studio development
environment. You can use Report Designer to build and publish a report by creating a Reporting Services
project with Visual Studio.

This lesson describes how to use Report Designer, and explains the key elements of a report that you need
to develop with it.

Lesson Objectives
After completing this lesson, you will be able to:

Describe the structure of a report.

Use the Report Wizard to create a report.

Identify key elements of the Report Designer interface.


Work with data sources.

Work with datasets.

Describe the different kinds of tablix data regions you can use in a report.

What Is a Report?
A report is an XML document that complies with
the Report Definition Language (RDL) schema and
includes a description of how data should be
presented by a report renderer. A report includes
the following elements:

Body. A description of the items included in the report and how they should be formatted and
presented.

Data Sources. Connection information to access the data sources where the report data is stored. Data
sources can be embedded in the report, or stored in a separate file and shared across multiple reports.

Datasets. Query definitions that determine the data fields presented in the report. Each dataset is
associated with a data source, and can be embedded in the report or stored in a separate file and
shared across multiple reports.

You can author reports in any text editor by typing the appropriate RDL elements and attributes.
However, in most cases, you should use the Report Designer authoring environment.

Using the Report Wizard


You can use the Report Wizard to create a
complete report, or to quickly establish a starting
point from which you will develop a more
complex report.

The Report Wizard is started when you perform one of the following actions:

Create a new project in Visual Studio by using the Report Server Project Wizard template.

Add a report to an existing Report Server project by selecting the Report Wizard template, or by
right-clicking the Reports folder in Solution Explorer and clicking Add New Report.

After clicking Next on the Welcome to the Report Wizard page, the following pages are presented to
gather the information needed to create your report:

1. Select the Data Source. On this page, you can select an existing shared data source in the project or,
if you create a new one, you must specify the connection information and credentials to use when
accessing the data source.

2. Design the Query. On this page, you can enter a query to define a dataset for your report. The
specific query syntax depends on the type of data source selected in the previous step. If the data
source supports it, you can use the Query Builder graphical tool to create the query.

3. Select the Report Type. On this page, you can choose to create a tabular or matrix report. Tabular
reports display data in rows, while matrix reports show data fields grouped in rows and columns, with
aggregated values in the intersecting cells, similar to a PivotTable or crosstab.

4. Design the Table/Matrix. The specific page displayed at this point depends on your choice of report
type. In either case, this page enables you to specify the fields and groupings in your table or matrix.

5. Choose the Table Layout. If you chose to create a tabular report, you can use this page to control
how groupings are displayed, display subtotals for grouped numeric values, and enable interactive
drill-down functionality that report viewers can use to expand or collapse grouped rows within the
table.

6. Choose the Table/Matrix Style. On this page, you can choose from a number of predefined visual
formats for your report.

7. Choose the Deployment Location. On this page, you must specify the URL of the report server
where the report will be published.

8. Complete the Report Wizard. On this final page, you must specify a name for the report. Then you
can view a summary of the options you have selected and elect to preview the report when it is
created. Clicking Finish on this page generates the report.

The Report Designer Interface


After you have created a report with the Report
Wizard or added a blank one to a Report Server
project, you can edit the report in the Report
Designer interface. Key elements of the Report
Designer interface include:

Solution Explorer. This pane shows the projects and associated items in the currently open Visual
Studio solution. For a Report Server project, it shows the reports, shared data sources, shared datasets,
and other resources.

The Report Design Surface. This is the graphical report development area on the Design tab of the
Report Designer. You use this part of the interface to design a report.

The Report Body. This represents the body of your report, where you can drag items to define the
layout.

The Properties Pane. This pane shows the properties of the selected item.

The Report Data Pane. This pane shows the data sources, datasets, parameters, and images you have
defined, as well as built-in fields you can use in any report.

The Toolbox. This pane shows the report items you can place in the report body.

The Grouping Pane. This pane shows the data groupings defined in your report, and enables you to
define and manage groupings to aggregate report data.

The Preview Tab. This tab previews the report, enabling you to verify how it is rendered before
publishing.

Working with Data Sources


A report requires one or more data sources. Each data source:

Defines the connection string used to access the database where report data is stored.

Specifies the credentials to use when connecting to the database.

Can be shared across multiple reports or embedded in a single report.

You can create a data source in the following ways:

Specify details in the Select the Data Source page of the Report Wizard.

Add a data source to the report in the Report Data pane.

Add a shared data source to the project in Solution Explorer.


In the Report Data pane, you can also convert an embedded data source into a shared data source.
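For example, a data source for a SQL Server database typically specifies a connection string of the following form. The server and database names shown here match the demonstration environment used in this module; substitute your own values:

```
Data Source=localhost;Initial Catalog=AdventureWorksDW
```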

Working with Datasets


After you have defined one or more data sources
for your report, you must define the datasets that
contain the fields to be displayed. A dataset:

Defines a query used to retrieve data from a data source.

Determines the fields available for use in the report.

Can be shared across multiple reports or embedded in a single report.

You can create one or more datasets for each data source in the following ways:

Specify details for the data source in the Design the Query page of the Report Wizard.

Add a dataset to the report in the Report Data pane.

Add a shared dataset to the project in Solution Explorer.

In the Report Data pane, you can also convert an embedded dataset into a shared dataset.

Dataset Guidelines
When creating datasets, consider the following guidelines:

Include only columns required for display, sorting, or grouping in the report. For example,
avoid using a SELECT * statement, and instead list only the columns that are actually required.

Include only the rows required for the detail level of the report. For example, if you are creating
a report to show sales performance for each product, grouped by category, there is no need to
include the individual order rows. Instead, use a query with a GROUP BY clause that aggregates the
orders by product and category. Product-level aggregation in the query is then the detail level of the
report.

Sort data in the query. Although you can sort data in a data region in a report, it is generally more
efficient to sort it in the query using an ORDER BY clause.

Use views instead of base tables. Querying a view decouples the report from the underlying table schema, making reports easier to maintain when the schema changes.
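As an illustrative sketch of these guidelines, the following T-SQL query (the table and column names are assumed, based on a typical data warehouse schema) returns only the required columns, aggregates order rows to the product level with a GROUP BY clause, and sorts in the query with an ORDER BY clause:

```sql
SELECT c.CategoryName,
       p.ProductName,
       SUM(f.SalesAmount) AS TotalSales   -- aggregate order rows to product level
FROM FactSales AS f
JOIN DimProduct AS p ON f.ProductKey = p.ProductKey
JOIN DimCategory AS c ON p.CategoryKey = c.CategoryKey
GROUP BY c.CategoryName, p.ProductName    -- product-level detail for the report
ORDER BY c.CategoryName, p.ProductName;   -- sort here, not in the data region
```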

Tablix Data Regions


After defining a data source and a dataset, you
can use a tablix data region to display data field
values in a report. The term tablix is derived from
the words table and matrix. A tablix data
region provides a common user interface element
that supports both kinds of report layout.

You can use the following types of tablix data region in a report:

Table. A table data region shows data as rows in a table with a column for each field. For example, you could use a table to show sales results aggregated by country, with a row for each country.

Matrix. A matrix data region shows data fields grouped in rows and columns, with aggregated values
in the intersecting cells, similar to a PivotTable or crosstab. For example, you could use a matrix to
show sales for each country for a range of years, with a row for each country and column for each
year.

List. A list data region shows a repeated arrangement of data items in a free-form layout. For
example, you could use a list to show details for each store in the organization as a store name field
formatted as a heading, with labels and field values for business type, city, and phone number.

When you create a report with the Report Wizard, a table or matrix data region is automatically
generated, depending on the type of report you selected in the wizard. To add a tablix data region to an
existing report, drag the appropriate data region item from the toolbox and position it in the report body.

After adding a tablix data region to the report, you can specify the data it should display by dragging
fields from the Report Data pane into the appropriate space in the data region. A tablix data region can
only be bound to a single dataset; the first field you add creates this binding. Attempting to drag fields
from a second dataset into a data region results in an error.

Note: You can use lookup functions to include data from multiple related datasets in a
single data region.
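For example, assuming a second dataset named ProductCategories that shares a ProductKey field with the dataset bound to the data region (both names are hypothetical), a Lookup expression can retrieve the matching category name for each row:

```
=Lookup(Fields!ProductKey.Value, Fields!ProductKey.Value, Fields!CategoryName.Value, "ProductCategories")
```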

Demonstration: Creating a Report


In this demonstration, you will see how to:

Use the Report Wizard to create a report.

Format and preview a report in Report Designer.

Demonstration Steps
Use the Report Wizard to Create a Report

1. Ensure that both 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Demofiles\Mod09 folder, run Setup.cmd as Administrator.

3. Start Visual Studio, and create a new project using the Report Server Project Wizard template.
Name the project Reports Demo and save in the D:\Demofiles\Mod09 folder.

4. In the Welcome to the Report Wizard page, click Next.

5. In the Select the Data Source page, create a new data source named AdventureWorksDW that
uses Windows authentication to connect to the AdventureWorksDW database on the localhost
instance of SQL Server. Do not select the Make this a shared data source checkbox, and click Next.

6. On the Design the Query page, click Query Builder, and in the Query Designer window, perform
the following steps and click OK:

o Add the DimGeography, DimReseller, and FactResellerSales tables.

o Select the EnglishCountryRegionName, StateProvinceName, and City columns in the DimGeography table.

o Select the ResellerName column in the DimReseller table.



o Select the SalesOrderNumber, OrderDate, and SalesAmount columns in the FactResellerSales table.

o Add the following values to the Alias column:

EnglishCountryRegionName: Country-Region
StateProvinceName: State
ResellerName: Reseller
7. On the Design the Query page, click Next.

8. On the Select the Report Type page, ensure that Tabular is selected, and then click Next.

9. On the Design the Table page, add all fields to the Details section. Then click Next.

10. On the Choose the Table Style page, preview each of the built-in styles, and then select Generic and
click Next.

11. On the Choose the Deployment Location page, review the default selections, and then click Next.

12. On the Completing the Wizard page, change the report name to Reseller Sales and click Finish.

Format and Preview a Report

1. When the report has been created, click the report design surface and, in the View menu, click
Report Data.

2. In the Report Data pane, expand Data Sources and note that the AdventureWorksDW data source
you defined in the wizard has been created.

3. Right-click the AdventureWorksDW data source and click Convert to Shared Data Source. Then
note that the data source is added to the Shared Data Sources folder in Solution Explorer and can
be used by other reports.

4. Expand Datasets and note that a dataset named Dataset1 has been created from the query you
defined in the wizard.

5. Right-click Dataset1 and click Dataset Properties. Then change the name to ResellerSales and click
OK.

6. Click the Preview tab and view the report with its default formatting. If a command window opens,
minimize it. Then click the Design tab.

7. In the report body, click the text box at the top left containing the report title (Reseller Sales), and
then use the formatting buttons on the toolbar to make the title bold and set its font size to 14pt.

8. Click the tablix data region so that the gray row and column headers appear. Click the gray box
where the row and column headers intersect to select the data region. Then drag the multidirectional
arrow handle to move the data region down to make room for a larger title.

9. Click the title text box and resize it so all the text is visible.

10. In the tablix data region, click the top-left cell (which contains the Country Region column title) then
hold Shift and click the bottom-right cell (which contains the [SalesAmount] field) to select all cells
in the data region. Then click the Align Left button on the toolbar.

11. Click outside the tablix data region to de-select it, and then select the first row of the tablix data
region (which contains the column titles) and, on the toolbar, click the Bold button. Then click the
Background Color button and set the background to Light Gray.

12. Right-click the cell containing the [OrderDate] field and click Textbox Properties. In the Text Box
Properties dialog box, on the Number tab, select the Date category and the 31-Jan-00 type and
click OK.

13. Click the cell containing the [SalesAmount] field and press F4. In the Properties pane, set the
Format property to $###,##0.00.

14. Click the tablix region so that the gray row and column headers appear, and then click and drag the
border between the column headings to widen the columns so that the data they contain fits on a
single line. You may need to switch back and forth between Design and Preview tabs to determine
appropriate column widths.

15. Click the Preview tab to view the completed report.

16. Click the Export button, and then click Excel. When prompted, save the report as Reseller Sales.xlsx
in the D:\Demofiles\Mod09 folder.

17. Close Visual Studio, and then start Excel and open the Reseller Sales.xlsx Excel workbook in the
D:\Demofiles\Mod09 folder to view the exported report.

18. Close Excel.

Considerations for Data Model Sources


Most of the examples considered in this module
are based on data retrieved from a data
warehouse database. However, in many BI
solutions, reporting is performed using data
retrieved from a multidimensional or tabular data
model in a SQL Server Analysis Services database.
Reporting Services supports the use of Analysis
Services as a data source and the creation of
datasets based on MDX queries.

When using Analysis Services data models as a source for reports, consider the following guidelines:

Edit MDX queries in a suitable query tool. Report Builder and Report Designer include a graphical
query editor you can use to create simple MDX queries. In many cases you will want to modify the
MDX that is generated to include additional metadata or optimize the query syntax. You should use
the query designer in Report Builder and Report Designer to create the initial query and configure
parameters, and then copy the code to a fully-featured editing tool to refine it before importing it
back into the dataset.

Remove the NON EMPTY clause. A report often includes empty rows, for example, to indicate a lack
of sales of a particular product or for a particular day. To ensure that empty rows are included in the
dataset, remove the NON EMPTY clause from the MDX query.

Let the cube perform aggregation. Instead of using a specific function such as Sum or Count in
report field expressions, use Aggregate. This ensures that the specific aggregation defined in the
cube is applied. This is particularly important when the cube includes semi-additive measures that can
be aggregated across some dimensions but not others.

Let the cube perform sorting. Dimension attributes in a cube can be configured with a sort order
based on their name, key value, or a completely different attribute in the same dimension. Avoid
specifying sort expressions for groups in a report based on a data model. Rely on the sort order
defined for the data model's dimension attribute.
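To illustrate the NON EMPTY guideline, the following MDX sketch shows a generated query before and after editing. The cube, measure, and hierarchy names are assumed for illustration:

```
-- As generated by the graphical query designer (empty cells are excluded):
SELECT NON EMPTY { [Measures].[Revenue] } ON COLUMNS,
       NON EMPTY { [Order Date].[Calendar Year].MEMBERS } ON ROWS
FROM [Sales]

-- Edited to include empty rows in the dataset:
SELECT { [Measures].[Revenue] } ON COLUMNS,
       { [Order Date].[Calendar Year].MEMBERS } ON ROWS
FROM [Sales]
```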

Demonstration: Creating a Report from a Data Model


In this demonstration, you will see how to:

Create a Data Source for Analysis Services.

Create a Report from a Data Model.

Modify a Data Model Report.

Demonstration Steps
Create a Data Source for Analysis Services
1. Ensure that you have completed the previous demonstration.

2. Start Visual Studio and open the Report Demo.sln solution you created in the previous
demonstration.

3. In Solution Explorer, right-click Shared Data Sources and click Add New Data Source.

4. In the Shared Data Source Properties dialog box, change the Name property to SSAS Demo,
change the Type property to Microsoft SQL Server Analysis Services, and click Edit.

5. In the Connection Properties dialog box, in the Server name box, type localhost\sql2. In the
Select or enter a database name list, select DemoDB, and click OK.

6. In the Shared Data Source Properties dialog box, click OK, and verify that the SSAS Demo.rds data
source is created in the Shared Data Sources folder in Solution Explorer.

Create a Report from a Data Model

1. In Solution Explorer, right-click Reports and click Add New Report.

2. On the Welcome to the Report Wizard page, click Next.

3. On the Select the Data Source page, select the SSAS Demo shared data source, and click Next.

4. On the Design the Query page, click Query Builder.



5. In the Query Designer dialog box, expand Measures and Internet Sales, and drag Revenue to the
query results area.

6. Expand Product Category and drag EnglishProductCategoryName to the left of the Revenue
column in the results area. The results show revenue for each category.

7. Expand Geography and drag EnglishCountryRegionName to the left of the EnglishProductCategoryName column in the results area. The results show revenue for each category in every country or region.

8. Expand Order Date and drag Month to the left of the EnglishCountryRegionName column in the
results area. Then drag CalendarYear to the left of the Month column in the results area. The results
show revenue for the separate categories in every country or region, for each month of each year.
Note that there were no accessories sold in 2005, so rows for accessory sales in that year are not
included.

9. In the Query Designer dialog box, click the Design Mode icon on the toolbar and view the MDX for
the query.

10. In the MDX code, remove the two instances of the NON EMPTY keywords in the first line of the
query, and then click the Execute Query button on the toolbar. Note that the results now include
rows for years and months where no sales were made.

11. In the Query Designer dialog box, click OK. On the Design the Query page, click Next.

12. On the Select the Report Type page, select Matrix and click Next.

13. On the Design the Matrix page, move the available fields to the following displayed fields, and then
click Next.

o CalendarYear: Page

o Month: Columns
o EnglishCountryRegionName: Rows

o EnglishProductCategoryName: Rows

o Revenue: Details
14. On the Choose the Matrix Style page, select Corporate, and then click Next.

15. On the Completing the Wizard page, change the report name to Monthly Revenue By Category
and click Finish.

Modify a Data Model Report

1. Click the Preview tab to view the report. If a command window opens, minimize it.

2. Note that the months are sorted alphabetically. Then click the Design tab.

3. Click the matrix in the report, and then, in the Column Groups pane under the report, click the
matrix1_Month drop-down list and click Group Properties.

4. In the Group Properties dialog box, on the Sorting page, click the Sort by list box to select the row,
and then click Delete and click OK. This prevents the report renderer from sorting the data and
instead relies on the sort order defined in the Analysis Services data model.

5. In the report, right-click the Sum(Revenue) cell and click Expression.



6. In the Expression dialog box, note that the wizard has aggregated the Revenue measure using a
Sum function. In this report, this aggregation produces the correct result. In general, however, it is
best to let Analysis Services perform the aggregation defined in the data model rather than use a
function in Reporting Services.

7. In the Expression dialog box, change the expression to the following code and click OK.

=Aggregate(Fields!Revenue.Value)

8. On the File menu, click Save All.

9. Click the Preview tab to view the modified report. When the report is rendered, click the Refresh
button in the report viewer to refresh the data.

10. View the data in the first page (you may need to scroll to the right to see sales revenue in 2005) and
use the Next Page button to view sales revenue for each year.

11. When you have finished, click the Design tab and minimize Visual Studio. You will return to it in the
next demonstration.

Lesson 3
Grouping and Aggregating Data in a Report
While many reports show detailed business data, it is very common for some reports to show aggregated
data that is grouped by time periods or business entities. For example, a sales manager might require a
report showing total sales grouped by salesperson and month.

Reporting Services reports can include multiple data groupings and aggregated values, and even
interactive drill-down functionality enabling consumers to show or hide different levels of detail in a
grouped report.

Lesson Objectives
This lesson describes how to implement grouping in a report and control how grouped data is rendered.
After completing this lesson, you will be able to:

Add groups to a tablix data region.

Display aggregated values based on groupings in a report.

Enable interactive drill-down functionality for data groupings in a report.

Configure page breaks for groups in a report.

Adding Groups to a Tablix Data Region


You can group data in a tablix data region to
make it easier for users to see a breakdown of
information by time periods or business entities. A
data region can contain multiple groups, each
with multiple levels. For example, a report showing
sales might group them by product category,
subcategory, and individual product.

The Groupings pane in Report Designer shows the groups defined in a data region. Table data regions can have row groups, and matrix regions can have both row and column groups. All table and matrix data regions include a detail-level group, which includes all data fields by default.

You can use the following techniques to create groups:

Drag the field on which you want to group data from the Report Data pane to the Groupings pane.
Place it above an existing group to create a parent group, or below an existing group to create a child
group.

Right-click an existing group in the Groupings pane, click an option on the Add Group context
menu, and specify the field in the bound dataset you want to group by. Using this technique enables
you to add a parent, child, or adjacent group before or after the existing one you right-clicked.

Right-click a cell in the table or matrix, click an option on the Add Group context menu, and specify
the field in the bound dataset you want to group by. Using this technique enables you to add a
parent, child, or adjacent group before or after the group containing the cell you right-clicked.

Note: In most scenarios, report data is grouped by individual fields in the dataset. However,
you can also use expressions to define groupings that include multiple fields or other criteria.

After defining a group, you can select it in the Groupings pane and edit its properties. In particular, you
should specify an appropriate name for the group because it can be used in expressions to specify scope
for aggregated values.

Displaying Aggregated Values


One of the benefits of grouping data in a report is
that you can use the groupings to show
aggregated values for the data. For example, a
report showing sales grouped by product
category, subcategory, and individual products
can show sales totals for each of those sections.

You can use the following techniques to add an aggregation to a report:

Right-click a group in the Groupings pane, click Add Total, and click Before or After, depending on where you want to display the total. This technique adds a row (or column if you clicked a column group for a matrix data region) containing a Sum function for each numeric field in the group.

Add a row or column to the group and enter an expression that includes an aggregation function
such as Sum, Avg, Min, Max, or Count. By default, the function aggregates data at the same level as
the group within which it is used. For example, you could use the following expression in a Product
group to show the total sales for each product:

=Sum(Fields!SalesAmount.Value)

In any group level, use an aggregation function with a named scope to aggregate a field value at
another group level. For example, you could use the following expression in a Subcategory group to
show the total sales for each subcategory and how that compares with the total sales for the
Product_Category parent group:

=Sum(Fields!SalesAmount.Value) & " of " & Sum(Fields!SalesAmount.Value, "Product_Category")



Enabling Interactive Drill-down Functionality


Groups and aggregations enable you to create
reports that include summary and detail levels of
information. In many business scenarios, report
users initially want to view summary information,
and only explore details for specific groupings
where a potential issue or anomaly merits further
investigation. Reporting Services supports this
drill-down functionality, enabling you to hide or
show groups in a report by clicking a toggle
button associated with a field in the parent group.

You can use the following techniques to enable drill-down functionality in a report:

In the Report Wizard, select Enable Drilldown in the Choose the Table Layout page for a tabular
report containing groups.

In an existing report containing groups, right-click a child group in the Groupings pane and click
Group Properties. In the Group Properties dialog box, on the Visibility tab, select Hide, select
Display can be toggled by this report item, and select the field in the parent group that you want
viewers to use to show or hide the group.

Considerations for Output Format


Drill-down functionality is only available when a published report is viewed in a browser or exported to a
format such as Microsoft Excel where the renderer supports this kind of interactivity. When exported to
a format that does not support interactivity, the data is rendered statically in the state in which it is
currently displayed in the browser. To overcome this problem, you can create a version of the same report
for each target format, for example, an interactive report that includes a hyperlink to a static version for
printing. This can entail a lot of additional work to develop and manage multiple versions of each report.
An alternative is to design adaptive reports that modify their behavior, depending on the rendering
extension being used. To help you accomplish this, Reporting Services supports the following global
variables:

RenderFormat.IsInteractive. You can use this variable to determine if the render format supports
interactivity, such as drill-down functionality to show hidden groups.

RenderFormat.Name. You can use this variable to determine the specific rendering extension being
used and apply format-specific settings.

For example, the following expression could be used to set the Hidden property of a group in a Tablix
data region. The group would then be hidden when rendered to an interactive format, but visible in
formats that do not support interactivity:

=iif(Globals!RenderFormat.IsInteractive, True, False)
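Similarly, RenderFormat.Name can be used to apply settings for a specific renderer. The renderer name compared here, EXCELOPENXML, is the name of the Excel rendering extension in recent versions of Reporting Services; verify the names registered on your report server before relying on them:

```
=iif(Globals!RenderFormat.Name = "EXCELOPENXML", True, False)
```

For example, this expression could set the Hidden property of an interactive-only report item so that it is hidden when the report is exported to Excel.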



Configuring Page Breaks for Groups


You can view reports online in a browser but, in
many cases, they are printed or exported to
another format for offline viewing and further
analysis. Reporting Services provides a number of
ways to control how reports are divided into pages
when they are printed or rendered in a format that
supports pagination. Pagination is particularly
important when working with groups, for example,
enabling you to start a fresh page for each new
instance of a group.

Key ways in which you can control pagination include:

Defining page breaks in tablix data regions, groups, and rectangles. You can define explicit behaviors
for page breaks by setting the BreakLocation, ResetPageNumber, and PageName properties in the
PageBreak category.
o The BreakLocation property specifies where the page break should occur. For data regions and
rectangles, this can be before, after, or before and after the report element on which the page
break is defined. For groups, you can also specify that the page break should occur between
instances. For example, you can force a new page for each employee in a report that shows sales
grouped by employee.

o The ResetPageNumber property causes page numbering to be restarted on the new page that
the page break generates.

o The PageName property specifies a name for the new page that the page break generates.
Some renderers, such as Microsoft Excel, use page names to identify pages. You can set an
explicit value for this property, or you can use an expression to generate the value dynamically.
This can be used, for example, to specify the name of the employee in the grouped sales report
described in the previous topic.

o You can also set the Disabled property to disable a page break.

Setting the InitialPageName property of the report. This property defines the page name for the first
page in the report (or for all pages if it contains no explicitly-defined page breaks).
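For example, for the grouped sales report described above, the PageName property of the employee group could be set with a simple field expression (the Employee field name is assumed), so that each page, or each Excel worksheet, is named after the employee it contains:

```
=Fields!Employee.Value
```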

Pagination and Microsoft Excel


The ability to control pagination is particularly useful when you export reports to Microsoft Excel. By
default, the Excel rendering extension creates a workbook with a single worksheet containing the report
data. The worksheet name, which is shown in the tabs at the bottom of the workbook, is based on the
name of the report by default. However, if you specify an InitialPageName property value, it is used for
the worksheet name.

In many cases, you may want to render the report data on multiple worksheets in Excel. One reason for
doing this is to make it easier for users to navigate the report data, for example, by creating a new
worksheet (and corresponding tab) for each employee grouping in the sales report described in the
previous topic. Another reason is that Excel enforces a maximum number of rows for each worksheet. A
very large report may not render successfully if it contains more rows than are available in a worksheet.

Setting a page break in a report causes the Excel rendering extension to generate a new worksheet. If the
page break includes a PageName property value, the worksheet is named accordingly. This enables you
to design multiple-worksheet Excel-based reports that are easy to navigate.

Demonstration: Grouping Data in a Report


In this demonstration, you will see how to:

Create groups in a Tablix data region.

Display aggregate values for data groups.


Enable drill-down interactivity.

Configure report pagination based on data groups.

Demonstration Steps
Create Groups in a Tablix Data Region

1. Ensure that you have completed the previous demonstrations in this module.

2. Maximize the Report Demo solution in Visual Studio. In Solution Explorer, double-click the Reseller
Sales.rdl report.

3. In the Row Groups pane, in the (table1_Details_Group) drop-down list, point to Add Group and
click Parent Group. In the Tablix Group dialog box, in the Group By list, click [Country_Region],
select Add group header, and click OK.

4. Note that the report now contains two Country Region columns: one for the group you just
created, and another for the original field, which is no longer required. Right-click the gray column
header for the original Country Region field (the second column) and click Delete Columns to
remove it.

5. On the View menu, click Report Data. In the Report Data pane, expand Datasets, and drag the
State field from the ResellerSales dataset into the Row Groups pane, between the Country_Region
and (table1_Details_Group) groups.

6. Note that the previous steps added a group for the State field without a group header in addition to
the original State field. Right-click the gray column header for the original State field (the third
column) and click Delete Columns to remove it.

7. Right-click the [State] field in the second column, point to Insert Row, and click Inside Group
Above. This creates a header row for the group.

8. Right-click the [City] field in the third column, point to Add Group, and under Row Group, click
Parent Group. In the Tablix Group dialog box, in the Group By list, click [City], select Add group
header, and click OK.

9. Right-click the gray column header for the original City field (now the fourth column) and click
Delete Columns to remove it.

10. Right-click the [City] field in the third column, point to Add Group, and click Child Group. In the
Tablix Group dialog box, in the Group By list, click [Reseller], select Add group header, and click
OK.

11. Right-click the gray column header for the original Reseller field (the fifth column) and click Delete
Columns to remove it.

12. In the Row Groups pane, in the (table1_Details_Group) drop-down list, click Group Properties. On
the Sorting tab, click Add and, in the Sort by column, click [OrderDate]. Then click OK.

13. Preview the report, and switch back to the Design tab to widen the new columns as required. When
you are satisfied with the column widths, preview the report and note that the data is now grouped
and sorted.

Display Aggregate Values for Data Groups



1. On the Design tab, in the Row Groups pane, on the Country_Region drop-down list, point to Add
Total and click After. Note that this adds a row at the bottom of the table with an expression to
calculate the sum of the [SalesAmount] field.

2. In the Row Groups pane, on the State drop-down list, point to Add Total and click Before. Note
that this adds an expression to calculate the sum of the [SalesAmount] field to the first row in the
table (in which the [Country_Region] field is displayed in the first column).

3. Repeat the previous step for the City, Reseller, and (table1_Details_Group) groups, and note that
the Sales Amount column now contains subtotals for each grouping with a grand total at the
bottom.

4. Note that when you added the totals, Report Designer automatically created a Total label for each
row. Delete all these Total labels other than the one on the bottom row of the table.

5. Right-click the cell immediately above the [Sales Order Number] field in the Sales Order Number
column, and click Expression. In the Expression dialog box, enter the following expression and click
OK:

=CountDistinct(Fields!SalesOrderNumber.Value)

6. Right-click the cell containing the expression you just created, click Copy, and then paste the cell into
each of the empty cells above it.

7. Preview the report, and note that the number of orders and sales total for each group is displayed.
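The totals that Report Designer generates in these steps are ordinary aggregate expressions. For illustration, a group subtotal and the distinct order count used above look like the following (the "Country_Region" scope argument is optional and is shown here only as an example):

```
=Sum(Fields!SalesAmount.Value)
=Sum(Fields!SalesAmount.Value, "Country_Region")
=CountDistinct(Fields!SalesOrderNumber.Value)
```

When no scope is specified, the aggregate is evaluated over the group (or dataset) that contains the expression.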
Enable Drill-down Interactivity

1. On the Design tab, in the Row Groups pane, in the (table1_Details_Group) drop-down list, click
Group Properties.
2. In the Group Properties dialog box, on the Visibility tab, select the Hide option, select the Display
can be toggled by this report item checkbox, and in the drop-down list, select Reseller1. Then click
OK. Reseller1 is the text box containing the name of the reseller for each Reseller group.

3. Repeat the previous two steps to hide the following groups and enable them to be toggled by the
specified report items:

o Reseller (toggled by City1)

o City (toggled by State1)

o State (toggled by Country_Region1)

4. Preview the report and note that you can expand and contract the groupings to display the level of
detail you require.

Configure Pagination Based on Data Groups

1. On the Design tab, click the report design surface under the tablix and press F4. In the Properties
pane, set the InitialPageName property for the Report item to Reseller Sales Report.

2. Click the tablix data region so that the gray row and column headers appear, and click the gray box
where the row and column headers intersect at the top left to select the data region. In the
Properties pane, expand the PageBreak property group for the table1 tablix and set the
BreakLocation property to Start. This creates a page break at the start of the tablix data region.

3. In the Row Groups pane, click the Country_Region group. In the Properties pane, expand the
Group and PageBreak property groups for the Country_Region tablix member and set the
BreakLocation property to Between. This creates a page break between each instance of the
Country_Region group.

4. With the Country_Region group still selected, in the Properties pane, set the PageName property
to the following expression:

=Fields!Country_Region.Value

5. Preview the report and note that the first page contains only the report title.

6. Use the page navigation buttons to scroll through the report and verify that each country group
starts on a new page.

7. Click the Export button, and then click Excel. When prompted, save the report as Reseller Sales.xlsx
in the D:\Demofiles\Mod09 folder, replacing the file if it already exists.
8. Minimize Visual Studio, and then open the Reseller Sales.xlsx Excel workbook in the
D:\Demofiles\Mod09 folder and view the exported report, noting that it contains a title worksheet and
a worksheet for each country, in which users can expand or collapse the grouped data. Then close
Excel without saving the workbook.

Lesson 4
Publishing and Viewing a Report
After creating your reports, you can publish them to a report server for viewing by users. Professional
report developers use Report Designer in Visual Studio to publish reports at the project level or they can
publish individual reports. Before publishing your reports, you must be aware of how they are affected by
the project properties.

Lesson Objectives
After completing this lesson, you will be able to:

Set Report Server project properties and publish reports.

View a published report.

Reporting Services Project Properties


Before you publish reports created in Visual
Studio, you must ensure that the following
properties are set appropriately for the report
server where you intend to publish them:

OverwriteDataSources: Specifies whether to replace existing data sources when publishing shared
data sources. If this property is set to False, an attempt to publish a data source with the same
name as an existing one will fail.

TargetDatasetFolder: The folder where shared datasets are published. When publishing to a report
server in native mode, this property should be set to the folder name. When publishing to a report
server in SharePoint integrated mode, it should be set to the full URL for the document library in
the SharePoint site.

TargetDataSourceFolder: The folder where shared data sources are published. When publishing to a
report server in native mode, this property should be set to the folder name. When publishing to a
report server in SharePoint integrated mode, it should be set to the full URL for the document
library in the SharePoint site.

TargetReportFolder: The folder where reports are published. When publishing to a report server in
native mode, this property should be set to the folder name. When publishing to a report server in
SharePoint integrated mode, it should be set to the full URL for the document library in the
SharePoint site.

TargetReportPartFolder: The folder where report parts are published. When publishing to a report
server in native mode, this property should be set to the folder name. When publishing to a report
server in SharePoint integrated mode, it should be set to the full URL for the document library in
the SharePoint site.

TargetServerURL: The service endpoint for the report server. When publishing to a report server in
native mode, this property should be set to the URL for the report server web service. When
publishing to a report server in SharePoint integrated mode, it should be set to the URL for the
SharePoint site where you want to publish the reports.

TargetServerVersion: The compatibility version for reports. The RDL format used by Reporting
Services in SQL Server 2008 R2 and later is different from the format used in earlier versions. If
you are publishing to an older report server, you must set the TargetServerVersion appropriately.
You can publish reports in either version to report servers running SQL Server 2008 R2 or later,
but some features are not supported in the older RDL format.

After you have set the project properties, you can publish the reports, shared data sources, and shared
datasets in the project. To publish all items in the project, right-click the project in Solution Explorer and
click Deploy, or on the Build menu, click Deploy Project_name. To publish an individual item in the
project, right-click the item in Solution Explorer and click Deploy.

When you deploy a project or an individual item, the deployment status is shown in the Output pane.
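These deployment properties are stored per build configuration in the report project (.rptproj) file. The following fragment is a rough sketch of that format, not an exact schema; the element values shown are illustrative:

```xml
<Configurations>
  <Configuration>
    <Name>Release</Name>
    <Options>
      <OutputPath>bin\Release</OutputPath>
      <TargetServerURL>http://localhost/reportserver</TargetServerURL>
      <TargetReportFolder>Demo Reports</TargetReportFolder>
      <TargetDataSourceFolder>Data Sources</TargetDataSourceFolder>
      <TargetDatasetFolder>Datasets</TargetDatasetFolder>
      <OverwriteDataSources>false</OverwriteDataSources>
    </Options>
  </Configuration>
</Configurations>
```

Editing this file directly is rarely necessary; the project Properties dialog box writes these values for you.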

Viewing a Report
After a report has been deployed, users can view it
by performing the following steps:

1. Open a web browser and navigate to the


report location.

2. If the report has been published to a report


server running in native mode, the user must
navigate to the Report Manager site, and if
necessary enter appropriate authentication
credentials. If the report has been published
to a report server in SharePoint integrated
mode, the user must navigate to the
document library in the SharePoint site where
the reports have been published.

3. Click the report. If the report has no parameters, or all parameters have default values, it will be
rendered in the browser.

4. Enter parameter values to filter the report as required, and click View Report to render it with the
specified parameters.

5. Use the Report Viewer toolbar to navigate the report by scrolling through pages or searching for a
specified text value.

6. Export the report to a file or click the feed icon to download the report as an Atom feed.
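On a native-mode report server, a published report can also be rendered directly through URL access, which is convenient for scripting exports. The server, folder, and report names below are placeholders, and the available rs:Format values depend on the renderers installed on your report server:

```
http://localhost/reportserver?/Sales/Reseller Sales&rs:Command=Render&rs:Format=EXCELOPENXML
```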

Demonstration: Deploying Reports


In this demonstration, you will see how to:

Configure a report project.


Deploy a report project.

View a published report.

Demonstration Steps
Configure a Report Project for Deployment

1. Ensure that you have completed the previous demonstrations in this module and maximize the
Report Demo solution in Visual Studio.
2. In Solution Explorer, right-click the Report Demo project and click Properties. Then set the following
properties and click OK:

o TargetDatasetFolder: http://localhost/sites/adventureworks/Reports/Demo/Datasets
o TargetDataSourceFolder: http://localhost/sites/adventureworks/Reports/Demo/Data Sources

o TargetReportFolder: http://localhost/sites/adventureworks/Reports/Demo

o TargetReportPartFolder: http://localhost/sites/adventureworks/Reports/Demo/Report Parts


o TargetServerURL: http://localhost/sites/adventureworks

Deploy a Report Project

1. On the Build menu, click Deploy Report Demo.

2. Observe the deployment progress in the status bar and the Output pane.

View Reports in SharePoint Server

1. When deployment has succeeded, close Visual Studio, start Internet Explorer, and browse to the
SharePoint site at http://localhost/sites/adventureworks. This site may take a few minutes to open
the first time you browse to it.

2. In the Quick Launch area on the left, click Reports. Then in the Reports document library, click the
Demo folder.

3. Click the Monthly Revenue By Category report and note that it is rendered in the SharePoint
interface.

4. In the Actions menu, point to Export, and click Excel. When prompted, save the report as Monthly
Revenue By Category.xlsx in the D:\Demofiles\Mod09 folder.

5. When the file has downloaded, click Open to view it in Excel.



6. Close Excel and Internet Explorer.



Lab: Creating a Report with Report Designer


Scenario
The sales manager at Adventure Works Cycles currently spends a large amount of time manually creating
sales reports in Excel. The report shows monthly sales broken down by product category, subcategory,
and individual product, but takes too long to produce each month. The sales manager has requested a
solution that generates the required report on demand and can be exported in Excel format.

To accomplish this, you will create a report, and enhance it to group and aggregate data, support
interactive drill-down, and generate a separate worksheet per month when exported to Excel.

Objectives
After completing this lab, you will be able to:

Create a report.

Group and aggregate data in a report.

Publish a report.
Estimated Time: 45 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Creating a Report


Scenario
The sales manager at Adventure Works spends a large amount of time each month using Excel to query
the data warehouse to produce a report that shows monthly sales broken down by product category,
subcategory, and individual product. You have been asked to create a reporting solution that provides the
sales manager with the required information.

The main tasks for this exercise are as follows:


1. Prepare the Lab Environment

2. Create a Report Project

3. Modify Report Properties


4. Format the Report

Task 1: Prepare the Lab Environment


1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab09\Starter folder as Administrator.

Task 2: Create a Report Project


1. Start Visual Studio and create a new project named AWReports in the D:\Labfiles\Lab09\Starter
folder. Use the Report Server Project Wizard template.

2. Specify the following settings in the wizard:

o Create a new shared data source named AdventureWorksDW that connects to the
AdventureWorksDW database on the MIA-SQL instance of SQL Server by using Windows
authentication.

o Use the following query to retrieve the data for the report (You can import this query from the
D:\Labfiles\Lab09\Starter\Sales Query.sql):

SELECT d.CalendarYear [Year],


d.MonthNumberOfYear [MonthNo],
d.EnglishMonthName [Month],
c.EnglishProductCategoryName [ProductCategory],
s.EnglishProductSubcategoryName [ProductSubcategory],
p.EnglishProductName [Product],
i.SalesOrderNumber,
i.OrderDate,
i.SalesAmount
FROM dbo.DimProductCategory c
INNER JOIN dbo.DimProductSubcategory s ON s.ProductCategoryKey = c.ProductCategoryKey
INNER JOIN dbo.DimProduct p ON p.ProductSubcategoryKey = s.ProductSubcategoryKey
INNER JOIN dbo.FactInternetSales i ON i.ProductKey = p.ProductKey
INNER JOIN dbo.DimDate d ON i.OrderDateKey = d.DateKey
ORDER BY d.CalendarYear, d.MonthNumberOfYear, c.EnglishProductCategoryName,
s.EnglishProductSubcategoryName, p.EnglishProductName, i.SalesOrderNumber;

o Create a tabular report that includes the Year, Month, ProductCategory, ProductSubcategory,
Product, SalesOrderNumber, OrderDate, and SalesAmount fields in the details section with no
groupings.

o Apply the generic report style.

o Use the default deployment location for the time being.

o Name the report Internet Sales.

Task 3: Modify Report Properties


1. In the Report Data pane, change the name of the Dataset1 dataset to InternetSales.

2. In the Row Groups pane, change the name of the (table1_Details_Group) to Sales_Details.

Task 4: Format the Report


1. Preview the report to view the default formatting.

2. Format the report to improve its visual style and legibility. Use the following formatting techniques as
required:

o To format text, select the text or text box in which it is displayed and use the formatting buttons
on the toolbar.

o To change column widths, drag the edges of the column headers.

o To apply a number or date format, right-click the text box containing the data field and click
Properties. Then on the Number tab, select an appropriate format.

Results: After this exercise, you should have a report that shows sales data from the
AdventureWorksDW database.

Exercise 2: Grouping and Aggregating Data


Scenario
The sales manager has reviewed the report you created and asked you to modify it to show each month
on a separate page, with sales for that month grouped by product category, product subcategory, and
individual product.

The main tasks for this exercise are as follows:

1. Delete Columns and Rows

2. Group and Sort the Report Data

3. Add Aggregate Summary Values

4. Enable Drill-down Interactions


5. Add Page Breaks

Task 1: Delete Columns and Rows


1. Delete the Year column.

2. Delete the row containing the column titles.

Task 2: Group and Sort the Report Data


1. Add a parent group to the Sales_Details group. The new group should group the data on the Month
field and include a group header.

2. Delete the column that displays the Month field in the details group. This column is no longer
required because the month is now displayed in the parent group header.

3. Edit the properties of the new Month group so that it is sorted by the MonthNo field. This ensures
that the data is displayed in the correct month order (January, February, and so on).
4. Add parent groups with group headers to the Sales_Details group for the ProductCategory field.

5. Add parent groups with group headers to the Sales_Details group for the ProductSubcategory
field.

6. Add parent groups with group headers to the Sales_Details group for the Product field.

7. Delete the details columns for the fields in the report that now have their own groups.

8. Adjust column widths as necessary so that the values in the new columns can be read easily.

Task 3: Add Aggregate Summary Values


1. Add totals rows before the Sales_Details, Product, ProductSubcategory, and ProductCategory
groups.

2. Add the following expression in the Product group heading row immediately above the
SalesOrderNumber field:

=Count(Fields!SalesOrderNumber.Value)

3. Copy the cell containing the expression to the ProductSubcategory, ProductCategory, and Month
rows.

Task 4: Enable Drill-down Interactions


1. Edit the visibility properties of the Sales_Details group so that it is hidden by default and its visibility
can be toggled by the Product1 text box (the textbox containing the Product value in the Product
group heading).

2. Modify the Product group so that its visibility can be toggled by the ProductSubcategory1 text box.

3. Modify the ProductSubcategory group so that its visibility can be toggled by the
ProductCategory1 text box. Verify that, when previewed, your report provides drill-down
functionality.

Task 5: Add Page Breaks


1. Set the InitialPageName property of the report to Sales Summary.

2. Create a page break between each instance of the Month group, and set the PageName property
for the Month group to the following expression:

=Fields!Month.Value

3. Add a page break at the start of the table1 Tablix object so that the table starts on a new page,
leaving a blank page with only the report title at the beginning.

4. Preview the report to verify that it is paginated correctly.

Results: After this exercise, you should have a report that includes sales data grouped by month, product
category, subcategory, and product.

Exercise 3: Publishing a Report


Scenario
You have created a report for the sales manager, and must now deploy it to the report server, which is
installed in SharePoint Integrated mode.

The main tasks for this exercise are as follows:

1. Deploy Report Items

2. View a Published Report

Task 1: Deploy Report Items


1. Set the following properties for the AWReports project:

o OverwriteDatasets: True

o OverwriteDataSources: True
o TargetDatasetFolder: http://mia-sql/sites/adventureworks/Reports/Datasets

o TargetDataSourceFolder: http://mia-sql/sites/adventureworks/Reports/Data Sources

o TargetReportFolder: http://mia-sql/sites/adventureworks/Reports

o TargetReportPartFolder: http://mia-sql/sites/adventureworks/Reports/Report Parts

o TargetServerURL: http://mia-sql/sites/adventureworks

o TargetServerVersion: SQL Server 2008 R2 or later


2. Deploy the project and close Visual Studio.

Task 2: View a Published Report


1. Use Internet Explorer to browse to the SharePoint site at http://mia-sql/sites/adventureworks.

2. In the Reports folder, view the Internet Sales report.

3. Export the report to a Microsoft Excel workbook named Internet Sales.xlsx in the
D:\Labfiles\Lab09\Starter folder.

4. Open the workbook in Excel and verify that it contains a title worksheet and a worksheet for each
month. Then close Internet Explorer and Excel.

Results: After this exercise, you should have configured and deployed a Report Server project.

Module Review and Takeaways


In this module, you have learned how to use SQL Server Reporting Services to create and publish reports.

Review Question(s)
Question: What kind of formal reports exist in organizations for which you have worked?

Module 10
Enhancing Reports with SQL Server Reporting Services
Contents:
Module Overview 10-1

Lesson 1: Showing Data Graphically 10-2

Lesson 2: Filtering Reports by Using Parameters 10-10

Lab: Enhancing a Report 10-16

Module Review and Takeaways 10-23

Module Overview
SQL Server 2014 Reporting Services supports reports that include graphical elements, such as charts,
indicators, and maps. It also supports parameters, enabling users to filter the data included in a report. In
this module, you will learn how to use these features to enhance reports.

Objectives
After completing this module, you will be able to:

Use charts and other visualizations to show data graphically in a report.


Use parameters to filter data in a report.

Lesson 1
Showing Data Graphically
While many reports contain only text and figures, some users prefer to view graphical representations of
business data. Reporting Services supports a number of data visualizations you can use to create graphical
reports that give users an intuitive understanding of key business metrics.

Lesson Objectives
This lesson describes various ways to add graphical elements to a report. After completing this lesson, you
will be able to:

Include images in a report.

Include charts to visualize data values in a report.

Use gauges to show key values in a report.

Summarize data graphically with data bars and sparklines.


Use indicators to show trends and Key Performance Indicators (KPIs) in a report.

Display geographical data in a report by using a map.

Including Images in a Report


You can enhance the appearance of a report by
including images. Common ways to do this
include:

Adding a logo. For example, a company


might include its logo in the header of all
reports to provide a consistent look and feel
that reflects corporate branding.

Binding an image to a data field. For


example, a report showing a sales breakdown
by product might include a photograph of
each product.
Including a background image in a report. For example, an annual report for shareholders in a
bicycle manufacturing company might include a subtle image of the corporate headquarters as a
background watermark.

To add an image to a report, drag an Image item from the Toolbox and place it in the report body. Then
use the Image Properties dialog box to configure the image. The key property to set is the image source,
which can be one of the following three kinds:
External. An image published in an external location such as a website. If you select this option, you
must specify the location of the image as a URL in the Use this image list box. Alternatively, if the
dataset used by the report includes a field containing a URL, you can select this field.

Embedded. An image that is embedded into the report. If you select this option, you can import an
image file to embed in the report.

Database. An image field in the dataset. If you choose this option, you must specify the field to bind
to the image control, and the Multipurpose Internet Mail Extension (MIME) type you want to use
when rendering the image.
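In the underlying RDL, these settings correspond to a report item similar to the following sketch. The image name and field name here are hypothetical, and the element set is simplified:

```xml
<Image Name="ProductPhoto">
  <Source>Database</Source>
  <Value>=Fields!LargePhoto.Value</Value>
  <MIMEType>image/jpeg</MIMEType>
  <Sizing>FitProportional</Sizing>
</Image>
```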

To display a background image for a report body or data region, set the following properties in the
BackgroundImage category of the Properties pane:

Source. Specify external, embedded, or database as described above.

Value. Specify the source of the image as above.

MIMEType. Specify the MIME type to be used for image data fields as described above.
BackgroundRepeat. Specify whether or not the image should be repeated to fill the background.
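In RDL, these four properties appear together inside the item's Style block. A minimal sketch, assuming an embedded image named Watermark:

```xml
<Style>
  <BackgroundImage>
    <Source>Embedded</Source>
    <Value>Watermark</Value>
    <BackgroundRepeat>NoRepeat</BackgroundRepeat>
  </BackgroundImage>
</Style>
```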

Working with Charts


Charts provide a common way to express data
values visually. Many reports include a chart as a
visual summary of the detailed data in a table or
matrix while others consist solely of a chart.

Use the following procedure to add a chart to a


report:

1. Drag a Chart item from the Toolbox and drop


it on the body of the report.

2. In the Select Chart Type dialog box, select an


appropriate type of chart for your data.
Available chart types include:

o Column charts

o Line charts

o Shape charts (such as pie charts)

o Bar charts
o Area charts

o Range charts

o Scatter charts

o Polar charts

3. Select the chart to display the Chart Data pane, and then specify the fields the chart will display.

4. Add one or more data fields to the Values area to specify the fields to be plotted on the value axis of
the chart. For example, add a SalesAmount field from a dataset showing customer sales orders.

5. Add fields to the Category Groups area to create the data points for the chart. For example, add a
Year field from the sales orders data set to show a summarized value for each year.

6. Add fields to the Series Groups area to plot values for multiple series. For example, add a Country
field from the sales orders data set to show summarized sales values for each country.

7. Edit the properties of the chart area and individual elements within it, such as axis titles, the chart
legend, and the chart title. This enables you to customize the appearance of the chart to suit your
requirements. You can also change the type of a chart if you later decide that the data would be
better represented using a different visualization. For example, you might change a column chart
showing sales grouped by year to a line chart that displays the trend over time more clearly.

Demonstration: Creating a Chart


In this demonstration, you will see how to:

Add a chart to a report.

Specify chart data.

Format a chart.

Demonstration Steps
Add a Chart to a Report

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd. Then, in
the D:\Demofiles\Mod10 folder, run Setup.cmd as Administrator.

2. Start Visual Studio and open Reports Demo.sln in the D:\Demofiles\Mod10 folder. Then in Solution
Explorer, double-click the Reseller Sales.rdl report.

3. Click the tablix data region so that the gray row and column headers appear, and click the gray box
where the row and column headers intersect at the top left to select the data region. Then drag the
multidirectional arrow handle to move the data region down about 10 centimeters (three inches).

4. If the Toolbox is not visible, on the View menu, click Toolbox, and in the Toolbox, drag a Chart to
the blank area you just created above the tablix data region. Then in the Select Chart Type dialog
box, in the Shape section, select the third chart style (3-D Pie) and click OK.

5. Move and resize the chart to fit the available space above the tablix data region.

Specify Chart Data

1. Click the chart to display the Chart Data pane.

2. In the Chart Data pane, in the Values section, add the SalesAmount field.

3. In the Chart Data pane, in the Category Groups section, add the Country_Region field.

4. Click the Preview tab and verify that the chart displays sales by country or region. If a console
window opens, minimize it.

Format a Chart
1. On the Design tab, click the Chart Title label, and press F4. Then in the Properties pane change the
Caption property to Reseller Sales by Geography.

2. Click a blank area on the report design surface, and then right-click the chart and click Chart
Properties.

3. In the Chart Properties dialog box, in the Color palette drop-down list, select any color palette, and
click OK.

4. Preview the report and view the formatted chart.



5. Click the Export button, and then click Excel. When prompted, save the report as Reseller Sales.xlsx
in the D:\Demofles\Mod10 folder.

6. Minimize Visual Studio, and then open the Reseller Sales.xlsx Excel workbook and view the exported
report, noting that the first worksheet contains the chart as an image. Then close Excel.

Showing Key Values with Gauges


Gauges provide an intuitive way to show KPIs in a
report. For example, you can use a gauge to show
sales performance against a sales quota. Gauges
can be radial or linear, and exist within a gauge
panel. When you add a gauge to a report, a gauge
panel is automatically created, to which you can
then add more gauges if required.

Use the following procedure to add a gauge to a


report:

1. Drag a Gauge item from the Toolbox and


place it on the report body.

2. In the Select Gauge Type dialog box, choose an appropriate gauge type for your data.

3. Select the gauge to display the Gauge Data pane, and add one or more fields to the Values section
to define the pointers that your gauge will display. For example, add the SalesAmount field from a
dataset showing sales figures for an organization.

4. Format the scale of the gauge and add one or more ranges so the pointer value is shown relative to a
comparative business measure. For example, set the scale to use a SalesQuota value as a maximum
so the needle shows current sales volume relative to the sales quota for the year. Add a range to
highlight important subsections of the scale, such as a red area for values less than the sales volume
that the company must achieve to break even.
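The scale properties accept expressions, so the maximum can be driven by data rather than hard-coded. For example, assuming the dataset includes a SalesQuota field, the scale's maximum value could be set to an expression such as:

```
=Sum(Fields!SalesQuota.Value)
```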

Summarizing Data with Data Bars and Sparklines


Data bars and sparklines are compact charts
usually displayed inline with data in a table or
matrix. They provide the same type of visual
representations of data as standard charts, but are
designed to be small, so they do not support
features such as legends, axis lines, labels, or tick
marks. Report authors can use data bars and
sparklines to convey a lot of information visually in
a small amount of space. In particular, data bars
and sparklines are useful for providing
comparisons of key metrics across a number of
rows of data, enabling report consumers to see
potential issues and identify outliers at a glance.

Although data bars and sparklines are similar, the key differences between them are:

You can use data bars to show multiple data points, but you typically use them to show a single value,
such as an annual sales total. You can include a data bar at any level of a table or matrix.

You can use sparklines to display multiple data points that are grouped by categories, such as sales
totals for a range of months. They work with aggregated data, and you can only use them in a group
level of a table or matrix.

Use the following procedure to add a data bar or sparkline to a report:

1. Drag a Data Bar or Sparkline item from the Toolbox and place it in an appropriate cell in a table or
matrix.

2. Select the appropriate data bar or sparkline type.

o You can use sparklines to display most kinds of chart, including column, bar, line, area, pie, and
range charts. Only pie charts can be shown in three-dimensional (3-D) format.

o You can only use data bars to display column or bar charts. These can be basic (in which each bar
or column represents a single value), stacked (in which multiple values are shown as different
colored ranges in the same bar or column), or 100 percent stacked (in which the bar or column
fills the available space and multiple values are shown as proportionally sized color ranges).

3. Select the data bar or sparkline to display the Chart Data pane, and then specify the fields to display.

4. Add one or more data fields to the Values area to specify the series and fields you want to show in
the chart. You can add multiple series and fields for each series. For example, you can create a
sparkline showing an employee's monthly sales totals together with their monthly sales quota.
However, you should bear in mind that the benefit of data bars and sparklines is that they are
compact. If you try to display too much data in such a small chart, you may confuse rather than
inform report users.
5. Add fields to the Category Groups area to define the grouping that defines the categories for the
chart. For data bars showing only a single value, a category group is not required. For sparklines
where you want to show the value (for example, sales total) across a range of categories (for example,
months), you need to specify the field on which you want to aggregate the data.

6. Add fields to the Series Groups area to add a further grouping of the data. For example, for a
sparkline that shows sales grouped by month, you can add a series group to show monthly sales
grouped by product. Generally, series groups are more effective in full charts than they are in data
bars and sparklines.

In addition to the data settings, you can set a wide range of properties to format the colors and style of
the data bar or sparkline.

Data Bar and Sparkline Alignment


You generally use sparklines and data bars to compare key metrics across multiple rows. It is important,
therefore, that the plotted data points and scales you use in each instance of the sparkline or data bar
provide a like-with-like comparison.

By default, the plot points and scale you use in an individual instance of a data bar or sparkline are
consistent only within the chart itself. For example, a sparkline might show that a salesperson had sales in
two months of the year and achieved substantially higher figures for one of those months than in the
other.
MCT USE ONLY. STUDENT USE PROHIBITED
Implementing Data Models and Reports with Microsoft SQL Server 10-7

However, if you compare this salesperson with another in the same report, the individual plot locations of
the months may not align. This is because the first salesperson only started working two months ago, but
the other has been employed since the beginning of the year. If the months in the sparkline are not
aligned, users cannot make a month-by-month comparison between the two salespeople. Additionally,
the scale used to show the sales volume may vary between the sparklines, so a large value for one month
in the first sparkline might not indicate the same volume of sales as a similarly-sized value in the second.

To resolve this issue, you can align the horizontal and vertical axes of the charts based on the data
grouping in which the charts are displayed. This ensures that the data is plotted on the same scale for all
instances of the sparkline or data bar in the grouping, which enables like-with-like comparisons.

Using Indicators
Indicators are minimal gauges you can use to show a particular icon in a report. They are commonly used to show:

Trends. For example, you can display an upward-pointing arrow to show a month-on-month increase in sales or a downward-pointing arrow to show a month-on-month reduction.

Status and conditions. For example, you can display a green traffic light to indicate that a salesperson is on track to meet a sales quota or a red one to show that an individual is behind on a quota.

Ratings. For example, you can display a star that is proportionally filled to show customer satisfaction levels.

Indicators are a useful way to show how various aspects of the business are performing against key goals
or metrics in a form that users can see quickly.

As a report author, you can use indicators in two ways by choosing the measurement unit on which you
want to base the indicator:

Numeric. The indicator icon is based on a numerical value. You must define the range of values to be
used, to determine how the indicator will be displayed, by specifying a start value and an end value
for each icon in the indicator set. These start and end values can either be absolute numbers or
derived at run time from an expression. Numeric indicators are useful when you want to show how a
particular data value compares to a target value, for example, to show how a salesperson's sales total
compares to a quota.

Percentage. The indicator icon is based on a percentile across a specified data group. You must
specify the start and end values for each icon in the percentage range (usually 0 to 100). The indicator
will show the appropriate icon based on where, in that percentile range, a particular instance of a
data value falls across the grouping. The percentage scale is determined by the minimum and
maximum values specified for the indicator.

In most cases, this is determined automatically by taking the lowest value for the specified data field
in the grouping as the minimum and the highest value as the maximum. You can override the minimum
and maximum values, in which case any rows with a value outside the range you specify will not have
an indicator icon displayed. Percentage indicators are useful when you want to show how rows in a
data grouping compare with one another, for example, to grade salespeople by the volume of sales
they have achieved.
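The numeric approach is easiest to see from the dataset behind the indicator. The following query is a sketch only: the table and column names (dbo.SalesQuotas, dbo.FactSales, and so on) are hypothetical, but it shows the shape of data an indicator can be based on, with icon start and end values then derived from expressions such as =Fields!SalesQuota.Value * 0.9 rather than absolute numbers.

```sql
-- Sketch of a dataset for a numeric indicator (hypothetical table names).
-- Each row returns both a measured value and a target, so the icon
-- thresholds can be defined as expressions based on the quota.
SELECT q.EmployeeKey,
       q.SalesQuota,
       SUM(f.SalesAmount) AS SalesTotal
FROM dbo.SalesQuotas AS q
JOIN dbo.FactSales AS f
    ON f.EmployeeKey = q.EmployeeKey
GROUP BY q.EmployeeKey, q.SalesQuota;
```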

Displaying Geographical Data with Maps

You can display geographical data in a report by adding a map that consists of one or more layers. Each layer can define:

Spatial data to define the geographical features to be displayed.

Analytical data to show business values that are related to the spatial data.

Legends and scales to help viewers interpret the map.

Use the following procedure to add a map to a report:

1. Drag a Map item from the Toolbox to the report body. This starts the New Map Layer wizard.

2. In the New Map Layer wizard, on the Choose a source of spatial data page, select a source for the
spatial data that will be used to define the map. This will define points, lines, or polygons that
represent spatial locations and features. You can obtain the spatial data for a map layer from:
o The map gallery provided with SQL Server Reporting Services. This contains a number of maps of
the United States.

o An Environmental Systems Research Institute (Esri)-compatible shapefile.


o A query that returns spatial data from a SQL Server database.

3. In the New Map Layer wizard, on the Choose spatial data and map view options page, select an
appropriate map resolution and size, and optionally add a Bing Maps layer to overlay the map with
geographical imagery.

4. In the New Map Layer wizard, on the Choose map visualization page, select the type of map you
want to create. The options available are:

o Basic map. This map shows spatial elements with no analytical data. Typically, a basic map shows
polygons that represent geographical areas such as states or sales territories.

o Color analytical map. You can add analytical data to a basic map and use it to format areas in
different colors to represent the analytical values. For example, you can compare figures in
multiple sales territories by coloring or shading each territory based on the sales value.

o Bubble map. You can use different-sized bubbles to indicate analytical values for each area
shown by the map. For example, each sales territory can have a bubble at its center, with the size
of the bubble indicating the relative sales volume for that territory.

5. If you choose a map visualization that requires analytical data, on the Choose the analytical dataset
page, select an existing dataset or add a new one that includes the analytical data you want to
display.

The analytical dataset must include a field in common with the spatial dataset, so you can relate each
instance of the analytical values to a spatial feature or location. For example, you might define a map
layer containing spatial data from a Transact-SQL query that returns the ID and geographical location of
each store and analytical data from a query that returns the store ID and total sales. You can then use the
common field (the store ID) to match the analytical data (the total sales) to the spatial data (the
geographical location).
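As a concrete sketch of this store example, the two datasets might look like the following queries. The table and column names here are illustrative, and the Location column is assumed to hold a SQL Server geography value:

```sql
-- Spatial dataset: the ID and geographical location of each store.
SELECT StoreID, StoreName, Location
FROM dbo.Stores;

-- Analytical dataset: total sales per store. The common StoreID field
-- lets the map layer match each sales total to a spatial location.
SELECT StoreID, SUM(SalesAmount) AS TotalSales
FROM dbo.StoreSales
GROUP BY StoreID;
```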

Alternatively, you might create a map layer using a US state map from the map gallery and add an
analytical dataset that includes sales volumes grouped by state code. You can then match the state code
field in the analytical dataset to the United States Postal Service (USPS) state code attribute in the US state
map dataset.

6. In the New Map Layer wizard, on the Choose color theme and data visualization page, select a
visual theme and specify which field to visualize on the map. For example, you could visualize sales
data in an analytical dataset using a color rule that colors states on the map as a shade of green,
yellow, or red depending on the sales volume for each state.

After you have added a map, you can add additional map layers, including Bing Maps imagery, by using
the Map Layer Wizard.

Note: Bing Maps layers can only be used when Internet access is available.

Lesson 2
Filtering Reports by Using Parameters
Reports can contain a great deal of data, and users often need to view only a subset of that information.
For example, a sales manager might want to view a report that shows sales results for a specific month or
quarter. You can use parameters to enable users to filter reports by specifying values or ranges for data
values they want to include.

This lesson describes how to use parameters to filter reports and provide a dynamic reporting solution
that empowers business users to focus on the data they need.

Lesson Objectives
After completing this lesson, you will be able to:

Describe key features of parameters.

Add parameters to a report.


Configure report parameters.

Define available and default values for parameters.

Use parameters to filter data and expressions in a report.

Overview of Parameters
Parameters enable you to make the contents of a report dynamic depending on the parameter values specified by the user or process making the request. Reporting Services supports two kinds of parameter:

Report parameters. Parameters that are passed to a report when it is rendered. Report parameters can be specified by:

o A user viewing the report interactively.

o A setting in a report subscription that delivers reports automatically at a scheduled time.

o A report that contains a parameterized sub report.

Dataset parameters. Parameters that are passed to a dataset when retrieving data. Dataset parameters are specified in a query, and are usually used to filter the query results. For example, the following Transact-SQL query uses a parameter named @Year in the WHERE clause to filter the results to include only sales that occurred in a specific year:

SELECT SalesOrderNumber, SalesAmount
FROM FactInternetSales
WHERE YEAR(OrderDate) = @Year

Dataset parameters are typically mapped to report parameters, enabling report users to specify the values
used to filter the query.

Adding Parameters to a Report

You can add parameters to a report implicitly by creating parameterized dataset queries, or explicitly by creating a new parameter in the Report Data pane.

Adding a Parameter to a Dataset Query

The easiest way to add a parameter to a report is to specify it in a dataset query. When you do this, Report Designer automatically defines a dataset parameter for each parameter in the query and a report parameter mapped to each dataset parameter. The name used for the parameter is derived from the parameter name specified in the query.

For example, a dataset based on the following query would result in a dataset parameter named Year and
a matching report parameter, also named Year:

Using a Parameter in a Dataset Query

SELECT SalesOrderNumber, SalesAmount
FROM FactInternetSales
WHERE YEAR(OrderDate) = @Year

Adding a Report Parameter

You can add a report parameter explicitly in the Report Data pane by clicking New and then clicking
Parameter. This enables you to add a report parameter that is not mapped to a dataset parameter. You
can then use this parameter to filter data in a tablix data region or chart by setting the Filter property to
an expression that compares a dataset field value to the parameter value.

Using an unmapped report parameter and a filter expression appears to produce the same results as using
a dataset parameter to filter the query results. However, you should be aware that a dataset containing a
parameterized query retrieves only data that matches the parameterized condition in the WHERE clause.
Un-parameterized datasets retrieve all the data requested by the query, even if that data is then excluded
from the report by a filter expression. This can affect the performance of the report.

In some scenarios, such as when using snapshots or cached datasets, you might choose to retrieve the
largest volume of data needed by a single user of the report or dataset. You could then use report-level
filters to generate reports at the required scope. In this scenario, the performance overhead of initial data
retrieval is compensated for by a reusable dataset or report, which minimizes the need for subsequent
requests to the data source.
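To make the trade-off concrete, the following sketch contrasts the two approaches for a year-scoped report; the filter settings shown for the second approach use standard report expression syntax:

```sql
-- Approach 1: dataset parameter. Only rows for the requested year are
-- retrieved from the data source.
SELECT SalesOrderNumber, SalesAmount, OrderDate
FROM FactInternetSales
WHERE YEAR(OrderDate) = @Year;

-- Approach 2: un-parameterized query. Every row is retrieved...
SELECT SalesOrderNumber, SalesAmount, OrderDate
FROM FactInternetSales;
-- ...and the data region's Filter property excludes rows at render time:
--   Expression: =Year(Fields!OrderDate.Value)
--   Operator:   =
--   Value:      =Parameters!Year.Value
```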

Configuring Parameters
After you have added a report parameter, you must configure the following properties:

Name. The name of the parameter.

Prompt. The label displayed in the report-rendering user interface.

Data type. The data type of the parameter. This can be Text, Boolean, Date/Time, Integer, or Float.

Allowable values. You can choose to allow a parameter value to be null or blank. You can also configure a parameter to support multiple values, in which case any expressions referencing the parameter must treat it as an array. Any queries would then be used to filter results based on the parameter, using the IN operator, as shown in the following example:

SELECT SalesOrderNumber, SalesAmount
FROM FactInternetSales
WHERE YEAR(OrderDate) IN (@Year)

Visibility. You can choose to hide a parameter or configure it as an internal parameter. These are useful when the parameter value will be set programmatically by a client application or when the report will always be used as a sub report and the parameter value is specified in the parent report.
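When a parameter allows multiple values, any expression that references it must treat Parameters!<name>.Value as an array rather than a single value. For example, for a multi-value Year parameter:

```vb
' Comma-separated list of all the selected years, e.g. for a report title
=Join(Parameters!Year.Value, ", ")

' The first selected value
=Parameters!Year.Value(0)

' The number of values the user selected
=Parameters!Year.Count
```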

Available and Default Parameter Values

You can specify a range of available values for a report parameter and pre-assign a default value so that the report will render without requiring the user to enter a parameter value first.

For example, a report that shows sales by month might use an integer parameter named Month to filter the data to include only sales in the specified months. You could configure the Month parameter so that the available values are restricted to the numbers 1 to 12, and set a default value that sets the parameter to the appropriate value for the current month.

You can use the following options to specify available and default values for a report parameter:

None. The user must explicitly enter a parameter value before the report can be rendered.

Specify values. The report developer can enter a list of available values and a single default value in the Report Designer user interface. When the report is rendered, the default value specified by the developer is used, and users can select from a list of specified available values.

Get values from a query. The default or available values are retrieved by a dataset when the report
is requested. The report developer must specify the field in the dataset to use for the default and
available values. A different field can be chosen for the label that users select when choosing from
available values.

Parameter Datasets
When you choose to provide available and default parameter values from a query, you must create one or more datasets that provide the parameter values. These are typically retrieved from tables in the report data source, or generated using Transact-SQL functions.

To optimize the performance of the report when it is initially rendered, you should define default parameters that return a minimal number of rows based on common usage of the report. For example, a sales report might be used to view sales by month, quarter, or year based on parameters for the start and end of a time period range. By setting the default values of these parameters to the shortest time period commonly used, you can reduce unnecessary data retrieval.

A common technique for applying minimal time period parameters across multiple reports is to create a shared dataset that retrieves a range of commonly-used date values. That dataset can then be used as the source for default and available parameter values for all time-filtered reports. For example, the following query could be used to return values for:

The first day of the current year.

The first day of the previous month.

The last day of the previous month.

The first day of the current month.

The current day.

SELECT DATEFROMPARTS(YEAR(GETDATE()), 1, 1) AS FirstOfThisYear,
  DATEADD(d, 1, EOMONTH(DATEADD(m, -2, GETDATE()))) AS FirstOfLastMonth,
  EOMONTH(DATEADD(m, -1, GETDATE())) AS LastOfLastMonth,
  DATEADD(d, 1, EOMONTH(DATEADD(m, -1, GETDATE()))) AS FirstOfThisMonth,
  CAST(GETDATE() AS Date) AS Today;

Working with Parameters

When you have added parameters to a report, you can map them to dataset parameters to filter the report data. You can also use them in the following ways:

Filter a data region or group by setting the Filter property to an expression that compares field values to parameter values. For example, you could create a report that retrieves the data for all sales, but filters a table to show only sales for specific products.

Display a sub report by passing a parameter to show a parent-child hierarchy. For example, you could create a report showing details of a specific sales order based on an OrderID parameter, and embed it as a sub report in a report showing all sales.

Reference parameters in an expression. For example, you could use the following expression to
display a dynamic report title that includes the specified Year parameter value:

="Sales for " & Parameters!Year.Value

Note: The Parameters collection referenced in the previous expression is just one of many
property collections you can use in a report. For example, the Globals collection contains
member properties such as ExecutionTime and PageNumber that you can use to display
contextual information about the report being rendered.
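For example, the following expressions use members of the Globals collection and could be placed in text boxes in a page header or footer:

```vb
' When the report was processed
="Rendered on " & Globals!ExecutionTime

' Page numbering for the rendered report
="Page " & Globals!PageNumber & " of " & Globals!TotalPages
```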

Demonstration: Using a Parameter


In this demonstration, you will see how to:
Add a parameter to a report.

Configure a parameter.

Set available and default values for a parameter.

Demonstration Steps
Add a Parameter to a Report

1. Ensure that you have completed the previous demonstration in this module, and maximize the
Reports Demo solution in Visual Studio.

2. In Solution Explorer, double-click the Reseller Sales.rdl report to open it in the report designer if it is
not already open, and click the Design tab.
3. On the View menu, click Report Data. Then in the Report Data pane, expand Datasets, right-click
the ResellerSales dataset, and click Query.

4. In the Query Designer dialog box, add the following WHERE clause to the existing Transact-SQL
query, and then click OK:

WHERE YEAR(FactResellerSales.OrderDate) = @Year

5. Right-click the ResellerSales dataset and click Dataset Properties. Then in the Dataset Properties
dialog box, on the Parameters tab, note that a dataset property named @Year has been created,
and click OK.

6. In the Report Data pane, expand Parameters and note that a report parameter named Year has
been created.
Configure a Parameter

1. In the Report Data pane, right-click the Year report parameter and click Parameter Properties.

2. In the Report Parameter Properties dialog box, on the General tab, in the data type drop-down
list, select Integer. Then click OK.

3. Click the Preview tab and note that the report is not rendered. If a console window opens, minimize
it. Then, in the Year text box, type 2006 and click View Report. The report is rendered with data for
sales in 2006.

Set Available and Default Values for a Parameter

1. On the Design tab, in the Report Data pane, right-click Datasets and click Add Dataset.

2. In the Dataset Properties dialog box, perform the following steps and then click OK.

o In the Name box type SalesYears.

o Select Use a dataset embedded in my report.

o In the Data Source drop-down list, select AdventureWorksDW.

o Ensure that the Text query type option is selected.

o Enter the following Transact-SQL query:

SELECT DISTINCT YEAR(OrderDate) AS Year


FROM FactResellerSales
ORDER BY YEAR(OrderDate)

3. Repeat the previous two steps to create a dataset named LatestYear that uses the following Transact-
SQL query:

SELECT MAX(YEAR(OrderDate)) AS MaxYear


FROM FactResellerSales

4. In the Report Data pane, right-click the Year report parameter and click Parameter Properties.
5. In the Report Parameter Properties dialog box, on the Available Values tab, perform the following
steps:

o Select Get values from a query.


o In the Dataset drop-down list, select SalesYears.

o In the Value field drop-down list, select Year.

o In the Label field drop-down list, select Year.

6. On the Default Values tab, perform the following steps. Then click OK:

o Select Get values from a query.

o In the Dataset drop-down list, select LatestYear.

o In the Value field drop-down list, select MaxYear.

7. Click the Preview tab and note that the report is rendered using the most recent year in the database
(2008). Then, in the Year drop-down list, select 2007 and click View Report. The report is then
rendered with data for sales in 2007.

8. Close Visual Studio.



Lab: Enhancing a Report


Scenario
The sales manager at Adventure Works Cycles has reviewed the sales report you have created, and
requested the following enhancements:

The report should include a chart on the first page to summarize sales by month for each product
category, providing a visual executive summary. To accomplish this, you will add a chart to the
report in the space above the tablix data region.

Users should be able to filter the report by year, and see sales figures only for the specified year. You
will need to modify the query in the dataset used to retrieve the data to include a where clause that
filters based on a year. You will then add a parameter to the report with an appropriate default value
and list of available values.

The sales manager would like an additional report that shows a visual comparison of total sales and
monthly sales trend for product categories.

The sales manager would also like a report that shows sales by US state as a map.

Objectives
After completing this lab, you will be able to:
Add a chart to a report.

Add parameters to a report.

Add data bars and sparklines to a report.


Add a map to a report.

Estimated Time: 60 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Adding a Chart to a Report


Scenario
The sales manager at Adventure Works has requested that you add a chart to the first page of the sales
report. The chart should show sales volumes for each product category across months of the year.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Add a Chart Report Item

3. Specify the Data for the Chart

4. Format the Chart

Task 1: Prepare the Lab Environment


1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to 20466C-MIA-
SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab10\Starter folder as Administrator.



Task 2: Add a Chart Report Item


1. Open the AWReports.sln solution in the D:\Labfiles\Lab10\Starter folder with Visual Studio.

2. In the Internet Sales report, move the existing tablix data region down to make room for the chart.

3. Add a chart to the report above the tablix data region. The chart should be a stacked column chart,
and you should resize it to fit the space above the tablix data region.

Task 3: Specify the Data for the Chart


1. Configure the chart with the following data fields:

o Values: SalesAmount

o Category Groups: Month

o Series: ProductCategory

2. Edit the properties of the Month field in the Category Groups section so that the group is sorted by
the MonthNo field.

Task 4: Format the Chart


1. Edit the chart properties and change the Color palette property to the color palette of your choice.
2. Delete the chart title.

3. Format the vertical axis so that the numbers are formatted as currency with a separator for
thousands.
4. Preview the report to view your changes.

Results: After this exercise, you should have a report that includes a chart.

Exercise 2: Adding Parameters to a Report


Scenario
The report you have created for the sales manager currently shows sales grouped by month, but the data
is not filtered to any particular year. This means that the chart shows the total sales for each month
regardless of what year the sale occurred in. You must add a parameter so that the sales manager can
filter the report to include the years they want to view, with a default value that ensures the report only
shows sales for the most recent year.
The main tasks for this exercise are as follows:

1. Add a Parameter

2. Configure Available and Default Parameter Values

3. View a Parameterized Report in SharePoint Server

Task 1: Add a Parameter


1. Edit the query for the InternetSales dataset in the Internet Sales report so that it matches the
following Transact-SQL query. You can import this query from
D:\Labfiles\Lab10\Starter\Parameterized Sales Query.sql:

SELECT d.CalendarYear [Year],


d.MonthNumberOfYear [MonthNo],
d.EnglishMonthName [Month],
c.EnglishProductCategoryName [ProductCategory],
s.EnglishProductSubcategoryName [ProductSubcategory],
p.EnglishProductName [Product],
i.SalesOrderNumber,
i.OrderDate,
i.SalesAmount
FROM dbo.DimProductCategory c
INNER JOIN dbo.DimProductSubcategory s ON s.ProductCategoryKey = c.ProductCategoryKey
INNER JOIN dbo.DimProduct p ON p.ProductSubcategoryKey = s.ProductSubcategoryKey
INNER JOIN dbo.FactInternetSales i ON i.ProductKey = p.ProductKey
INNER JOIN dbo.DimDate d ON i.OrderDateKey = d.DateKey
WHERE d.CalendarYear IN (@Year)
ORDER BY d.CalendarYear, d.MonthNumberOfYear, c.EnglishProductCategoryName,
s.EnglishProductSubcategoryName, p.EnglishProductName, i.SalesOrderNumber

2. Edit the Year report parameter created when the dataset query was updated so that it has a data
type of Integer that allows multiple values.

3. Edit the textbox containing the report title so that it contains the following expression:

="Internet Sales for " & Join(Parameters!Year.Value, ", ")

4. Preview the report and verify that you can specify multiple year values in the parameter box by typing
2006, inserting a new line, and typing 2007. Click View Report to render the report with the
parameter values.

Task 2: Configure Available and Default Parameter Values


1. Add a dataset named SalesYears to the Internet Sales report. The data set should be embedded in
the report and use the following query (which you can import from D:\Labfiles\Lab10\Starter\Sales
Years.sql):

SELECT DISTINCT YEAR(OrderDate) [Year]


FROM FactInternetSales
ORDER BY YEAR(OrderDate) DESC

2. Add another embedded dataset named MaxYear that uses the following query (which you can
import from D:\Labfiles\Lab10\Starter\Max Years.sql):

SELECT YEAR(MAX(OrderDate)) [MaxYear]


FROM FactInternetSales

3. Edit the properties of the Year report parameter to get the available values from the SalesYears
dataset and the default value from the MaxYear dataset.

4. Preview the report and verify that the default parameter is used to render the report for the most
recent year, and that you can select one or more years from the available values in the Year
parameter drop-down list.

Task 3: View a Parameterized Report in SharePoint Server


1. Verify that the following properties are set for the AWReports project:

o OverwriteDatasets: True

o OverwriteDataSources: True

o TargetDatasetFolder: http://mia-sql/sites/adventureworks/Reports/Datasets

o TargetDataSourceFolder: http://mia-sql/sites/adventureworks/Reports/Data Sources

o TargetReportFolder: http://mia-sql/sites/adventureworks/Reports

o TargetReportPartFolder: http://mia-sql/sites/adventureworks/Reports/Report Parts

o TargetServerURL: http://mia-sql/sites/adventureworks

o TargetServerVersion: SQL Server 2008 R2 or later

2. Deploy the project.

3. Use Internet Explorer to browse to the SharePoint site at http://mia-sql/sites/adventureworks and
view the Internet Sales report in the Reports document library.

4. Verify that the Year parameter filters the report to show sales for the selected years.

5. Export the report to an Excel workbook named Internet Sales.xlsx in the D:\Labfiles\Lab10\Starter
folder.

6. Open the report in Excel and verify that the first worksheet includes the chart you created. Then close
Internet Explorer and Excel.
7. Keep Visual Studio open for the next exercise.

Results: At the end of this exercise, the sales report will include a parameter named Year, and two new
datasets to retrieve the available and default values for the parameter.

Exercise 3: Using Data Bars and Sparklines


Scenario
The sales manager wants to be able to easily compare sales by product category. You must create a report
that shows a visual comparison of total sales for the year and another report showing the trend for sales
of each category across the year.

The main tasks for this exercise are as follows:

1. Add an Existing Report

2. Configure Datasets and Parameters

3. Add a Data Bar

4. Add a Sparkline
5. Deploy Report Items

Task 1: Add an Existing Report


1. Add the existing Sales Trends.rdl report in the D:\Labfiles\Lab10\Starter folder to the AWReports
project.

2. View the Sales Trends report, and note that it includes a dataset named Sales that contains a
parameter named CalendarYear, which is mapped to a report parameter of the same name.

Task 2: Configure Datasets and Parameters


1. Convert the SalesYears and MaxYear datasets in the Internet Sales report to shared datasets.
2. Add the SalesYears and MaxYear shared datasets to the Sales Trends report.

3. Configure the CalendarYear report parameter in the Sales Trends report to use the SalesYears
dataset as a source for available values, and the MaxYear dataset as a source for the default
parameter value.

Task 3: Add a Data Bar


1. Modify the report to include a Sales Volume: label and a data bar next to each product category.

2. Add the SalesAmount field to the Values section of the data bar.

3. Configure the series properties of the SalesAmount field to use the following fill options:

o Fill style: Gradient

o Color: Light Steel Blue


o Secondary color: Cornflower Blue

o Gradient Style: Left right

4. Configure the horizontal axis properties of the SalesAmount field so that the axes are aligned at the
table1 scope.

5. Ensure that the data bar does not have a border.

6. When you preview the report, it should show a data bar that indicates the relative sales volume for
each product category.

Task 4: Add a Sparkline


1. Modify the report to include a Monthly Trend: label and a sparkline in the cells to the right of the
data bar.

2. Add the SalesAmount field to the Values area of the sparkline, and select the
MonthNumberOfYear field in the Category Groups section.

3. Configure the series properties of the SalesAmount field to use the following fill options:

o Fill style: Solid

o Color: Cornflower Blue

4. Configure both the horizontal and vertical axes so they are aligned at the table1 scope.
5. Ensure that the sparkline does not have a border.

6. When you preview the report, it should show a sparkline detailing the sales trend for each category
across the year.

Task 5: Deploy Report Items


1. Deploy the project.

2. Use Internet Explorer to browse to the SharePoint site at http://mia-sql/sites/adventureworks and


view the Sales Trends report in the Reports document library.
3. Verify that the data bar and sparkline appear in the report. Then close Internet Explorer.

4. Keep Visual Studio open for the next exercise.

Results: After this exercise, you should have created a report that uses a data bar and a sparkline to show
a visual comparison of sales by product category.

Exercise 4: Using a Map


Scenario
You have created a report that lists sales totals for each US state. The sales manager has requested that
the report shows these sales values visually on a map.

The main tasks for this exercise are as follows:

1. Add an Existing Report

2. Add a Map

3. Format the Map

4. Deploy Report Items

Task 1: Add an Existing Report


1. Add the existing US Sales By State.rdl report in the D:\Labfiles\Lab10\Starter folder to the
AWReports project.

2. View the US Sales By State report, and note that it includes a dataset named Sales that contains a
parameter named CalendarYear, which is mapped to a report parameter of the same name.

3. Add the SalesYears and MaxYear shared datasets to the US Sales By State report.

4. Configure the CalendarYear report parameter in the US Sales By State report to use the SalesYears
dataset as a source for available values, and the MaxYear dataset as a source for the default
parameter value.

5. Preview the report and note that it shows sales for each US state.

Task 2: Add a Map


1. Add a Map item from the toolbox to the US Sales By State report.

2. Complete the New Map Layer wizard to create a map with the following settings:

o Use the USA by State Inset spatial data source in the map gallery.

o Use the default spatial data and map view options.

o Create a Color Analytical Map visualization.

o Use the Sales dataset as the analytical data source.

o Match the STATENAME field in the spatial data source, to the StateProvinceName column in
the analytical dataset.

o Use the Ocean theme to visualize the [Sum(SalesTotal)] field, with values shown on a color scale
from white to blue.

Task 3: Format the Map


1. Set the polygon properties of the PolygonLayer1 map layer so that [Sum(SalesTotal)] is displayed
as a tooltip.

2. Set the polygon color rule of the PolygonLayer1 map layer so that the color of each state is
displayed with the following distribution settings:

o Number of subranges: 10

o Range start: 0
3. Modify the properties of the legend above the color distribution legend at the top right to change
the title text to Sales ($).

4. Change the map title to Sales by State.

5. Remove the color scale at the bottom left.

6. When you preview the report, the map should indicate sales volume in each state by the shade of the
color used to fill the state, and display tooltips for each state that show the actual sales amount.

Task 4: Deploy Report Items

1. Deploy the project, and close Visual Studio.

2. Use Internet Explorer to browse to the SharePoint site at http://mia-sql/sites/adventureworks and


view the US Sales By State report in the Reports document library.

3. Verify that the map appears in the report. Then close Internet Explorer.


Results: After this exercise, you should have created a report that shows sales by US state on a map.

Module Review and Takeaways


In this module, you have learned how to add graphical elements to a report, and how to add and
configure parameters.

Review Question(s)
Question: What considerations can you think of for including graphical elements in a
report?

Module 11
Managing Report Execution and Delivery
Contents:
Module Overview 11-1

Lesson 1: Managing Report Security 11-2

Lesson 2: Managing Report Execution 11-6

Lesson 3: Subscriptions and Data Alerts 11-10

Lesson 4: Troubleshooting Reporting Services 11-17

Lab: Configuring Report Execution and Delivery 11-20

Module Review and Takeaways 11-24

Module Overview
When you provide a reporting solution for an organization, you must manage the execution and delivery
of reports. This will ensure the security of business data and provide a high-performance report viewing
and delivery experience for your users.

This module describes how to apply security settings and configure reports for delivery.

Objectives
After completing this module, you will be able to:

Configure security settings for a report server.


Configure report execution settings to optimize performance.

Use subscriptions and alerts to automate report and data delivery.

Troubleshoot reporting issues.



Lesson 1
Managing Report Security
Reports contain business data, so it is important to ensure that your reporting solution applies appropriate
security restrictions to protect sensitive information. This lesson discusses security configuration for
Microsoft SQL Server Reporting Services.

Lesson Objectives
After completing this lesson, you will be able to:

Describe the considerations for securing a reporting solution.

Configure authentication for a report server.


Apply permissions to implement authorization in a report server.

Configure secure network communication for a report server.

Introduction to Report Server Security


SQL Server Reporting Services provides a flexible
security model enabling you to apply security
settings to protect business data. While the
specific details of security models can vary
between organizations, every reporting solution
must address the following fundamental security
considerations:

Authentication
Your reporting solution should authenticate users
to verify their identity. Authentication usually
involves validating user credentials, such as a user
name and password, before allowing the user to
access the reporting solution. SQL Server Reporting Services supports a number of authentication
mechanisms, which you can configure to match the requirements of your organization's application
infrastructure.

Authorization
After verifying the identity of each user, your reporting solution must authorize access to specific reports,
data sources, or other objects based on assigned permissions. SQL Server Reporting Services supports a
role-based authorization model in which you can apply permissions to a specific set of roles, and assign
users to those roles to authorize access to report items.

Secure Communication
When users access a report server, data is transmitted across network connections. In some cases, you
should consider using encrypted connections to ensure that data cannot be intercepted and read by a
third party.

Managing Authentication
SQL Server Reporting Services supports a number
of authentication models.

By default, users connecting to a report server are


authenticated based on their Microsoft Windows
credentials, but you can edit the
RSReportServer.config file to enable any of the
following authentication models:

RSWindowsNegotiate. This is the default


authentication model for report servers in
native mode and authenticates users based on
their Windows credentials using Kerberos
where supported. In environments where
Kerberos is not supported, this authentication model falls back to Windows NT LAN Manager (NTLM)
authentication.

RSWindowsNTLM. This authentication model uses NTLM access tokens to authenticate users, and
does not support delegation of credentials (in which the report server can impersonate the user to
access resources on another server).

RSWindowsKerberos. This authentication model uses a Kerberos ticket to identify a user. The user's
credentials can be delegated across multiple servers in the same domain. Kerberos authentication
requires specific infrastructure configuration, including registering Service Principal Names (SPNs) for
service accounts.
RSWindowsBasic. This authentication model uses HTTP basic authentication to authenticate user
access to the report server. Credentials are passed in base64 clear text, so you should ensure that your
report server is configured to use an encrypted channel in this model.
Custom. You can use a custom model that employs ASP.NET Forms authentication to validate users.
At the HTTP layer, requests are treated as anonymous and redirected to your custom module.
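The authentication model is selected in the Authentication section of RSReportServer.config. The fragment below is a sketch of enabling NTLM-only authentication in a native mode installation; the element names follow the default configuration file layout, and the values shown are examples rather than recommendations.

```xml
<!-- RSReportServer.config (native mode): accept only NTLM authentication.
     Replace RSWindowsNTLM with RSWindowsNegotiate, RSWindowsKerberos, or
     RSWindowsBasic to use one of the other models described above. -->
<Authentication>
  <AuthenticationTypes>
    <RSWindowsNTLM />
  </AuthenticationTypes>
  <EnableAuthPersistence>true</EnableAuthPersistence>
</Authentication>
```

After editing the file, restart the Reporting Services service so the new authentication settings take effect.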

Managing Authorization
Access to reporting items such as reports, data
sources, data sets, and report parts is based on
permissions assigned to roles. The specific roles
and permissions you can use to secure a report
server depend on the mode in which it is installed.

Using Roles in Native Mode


When a report server is installed in native mode,
you use system-level roles to control access to the
report server and server-level functionality, such as
shared schedules.

Item-level roles are used to control access to individual report items. Reporting Services includes the
following predefined roles:

System-level roles

o System User. Members can access the report server.

o System Administrator. Members can administer the report server.


Item-level roles

o Content Manager. Members have full control of items, including the ability to manage
permissions.

o Publisher. Members can add, update, view, and delete items.

o Report Builder. Members can use Report Builder to create and edit items.

o Browser. Members can view items and create subscriptions.


o My Reports. Members can manage reports in a personal folder named My Reports.

If necessary, you can customize the individual permissions assigned to the predefined roles. Additionally,
you can define your own custom roles and assign specific permissions to them.
Item-level role membership is assigned for individual folders and items. By default, role membership
assignments are inherited by subfolders, but you can override inherited role memberships by creating
custom role assignments at any level in a folder hierarchy.

Using Groups in SharePoint Integrated Mode


When a report server is installed in Microsoft SharePoint Integrated mode, access to report items is
controlled through permissions granted to SharePoint groups in document libraries where report items
are supported. The following table shows how SharePoint groups relate to native mode Reporting
Services roles:

Permission: Access the report server
Reporting Services role: System User
SharePoint group: There is no equivalent SharePoint group for this permission. Access to reporting
items is based on permissions to access the SharePoint farm.

Permission: Manage the report server
Reporting Services role: System Administrator
SharePoint group: There is no equivalent SharePoint group for this permission. The report server can
be managed by any user with administrative access to the SharePoint farm.

Permission: Full control of items
Reporting Services role: Content Manager
SharePoint group: Use the Owners group to enable a user to manage reporting items in a SharePoint
document library.

Permission: Add, update, view, and delete items
Reporting Services role: Publisher
SharePoint group: Use the Members group to enable a user to publish, update, view, and delete
reporting items in a SharePoint document library.

Permission: Use Report Builder
Reporting Services role: Report Builder
SharePoint group: There is no equivalent SharePoint group for this permission. Users assigned to the
Owners and Members groups can use Report Builder to create and edit items.

Permission: View reports
Reporting Services role: Browser
SharePoint group: Use the Visitors group to enable a user to view reporting items in a SharePoint
document library.

Permission: Use the My Reports folder
Reporting Services role: My Reports
SharePoint group: There is no equivalent SharePoint group for this permission because the My Reports
folder is not supported in SharePoint integrated mode. You can use the My Site feature of SharePoint
Server to provide similar functionality.

Managing Secure Communication


Users generally access a report server with a web
browser such as Windows Internet Explorer.
When this is performed across the Internet or any
network where data must be secure while in
transit, you can configure the report server to use
Secure Sockets Layer (SSL) security to encrypt
network communication between the user and the
report server.

Configuring SSL Security in Native Mode


When a report server is installed in native mode,
you can use the following procedure to enable SSL
for secure network communication:

1. Install a server certificate for SSL communication.

You can obtain a certificate from a trusted certificate-issuing authority, or you can issue your own
certificates and configure clients to trust them.

2. Bind the certificate to each Reporting Services endpoint for which you want to enable secure
communication.

Use Reporting Services Configuration Manager to bind the certificate and specify the port to be used for
secure communication. You must bind the Reporting Services Web Service and Report Manager URLs
individually.

Reporting Services does not use Internet Information Services (IIS) to handle HTTP requests. If IIS is
installed on a server where Reporting Services is configured to use SSL, the W3SVC service must be
running.

Configuring SSL Security in SharePoint Integrated Mode


When a report server is running in SharePoint Integrated mode, there is no specific procedure for
enabling SSL communication for Reporting Services. You can use the following procedure to enable SSL
communication for the entire SharePoint site:

1. Install a server certificate for SSL communication.

2. Bind the certificate to the SharePoint website in IIS.



Lesson 2
Managing Report Execution
After you have published report server items, you can configure them to optimize report execution. The
specific optimization techniques you can use depend on a number of factors, including the authentication
credentials used to access report data sources and the acceptable latency of the data in the reports.

This lesson describes common ways to manage report execution.

Lesson Objectives
After completing this lesson, you will be able to:

Configure credentials for a shared data source.

Cache reports and data sets to optimize performance.

Use snapshots to create a report history.

Configuring Data Source Credentials


You can manage a shared data source in Report
Manager or a SharePoint document library, and
configure it to use any of the following credential
settings when accessing data:

Windows integrated security. Use this


setting when you want the data source to
impersonate the Windows identity of the
current user when connecting to the database
server. You can use this setting for database
servers that are installed on the report server
or on another server. If the credentials must
be impersonated across more than one other
server, then Kerberos authentication is required.

Credentials supplied by the user running the report. Use this setting when you want to prompt
the user to enter credentials when they view the report. The data source will then pass these
credentials to the database to allow access to the data for the report. You can specify that the
credentials entered by the user should be treated as Windows credentials or as a user name and
password for a non-integrated authentication model, such as a SQL Server login.

Credentials stored securely on the report server. Use this setting when you always want to access
the database using the same credentials, regardless of the user requesting the report. Credentials are
stored in an encrypted format on the report server, and users do not need to know the user name
and password to access the database. This configuration is required when you want to cache reports
or datasets that use the data source, or create data-driven subscriptions for reports that use it.

Credentials are not required. Use this setting for databases or other data sources that do not
require any authentication.

Optimizing Performance with Caching


Reports containing a large amount of data or
complex functions can be slow to render. To
improve performance, you can use the caching
functionality in Reporting Services to cache
datasets and reports when they are first accessed,
so they can be rendered more quickly for
subsequent requests. You can also use a refresh
plan to re-cache reports so they are already in
cache the first time they are requested.

Cached reports and datasets are stored in


memory, so data is not retrieved from the data
source when the report is rendered. Additionally, if
a report or dataset includes parameters, a cached instance is created for each distinct parameter
combination requested. Caching is not suitable in the following scenarios:

When real-time reporting is required every time the report is viewed.


If a report has a large number of parameters or a wide range of commonly requested parameter
value combinations.

If the data source for a report or dataset must use Windows Integrated authentication or credentials
provided by the user making the request.

You can enable caching for a report or dataset in Report Manager or a SharePoint document library.
When you configure caching for a report or dataset, you can use the following options to specify when
the cached instance should expire:

After a specified period of time. For example, you could cache a report for one hour after it is
rendered. The cached instance then expires and a subsequent request will result in a fresh cached
instance.

On a report or dataset-specific schedule. For example, you could configure a report to be cached
until midnight each day. A request after midnight will render a fresh copy.
On a shared schedule. For example, if you need to cache multiple reports and datasets, you can use
a shared schedule to ensure they all expire at the same time.

Additionally, a report or dataset may be removed from the cache at any time if it is replaced with a new
version.

Using Report Snapshots


Caching is a technique you can use to optimize
report performance. However, there may be cases
where you need greater predictability about the
specific point in time that data in a report
represents. In such scenarios, you can use
snapshots to create a report history and enable
users to view specific versions of a report relating
to the time the snapshot was created.

Snapshots are created for specific combinations of


parameters and require that the credentials used
by the report data sources are stored securely on
the server. Additionally, when using snapshots, you
should not map report parameters to dataset parameters. Instead, use a dataset query that returns all
the data, and apply filters in the report based on report parameters.
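As a sketch of this pattern, the dataset query below returns rows for every year, and the year restriction is applied as a report filter rather than a query parameter. The table and column names are assumptions based on the AdventureWorksDW schema used in this course, not the exact lab queries.

```sql
-- Hypothetical dataset query for a snapshot-enabled report: no query
-- parameter, so each snapshot contains data for all years.
SELECT d.CalendarYear,
       f.SalesAmount
FROM FactResellerSales AS f
JOIN DimDate AS d
  ON f.OrderDateKey = d.DateKey;
```

In the report itself, a filter on the data region would then compare Fields!CalendarYear.Value to Parameters!CalendarYear.Value, so a single snapshot can serve any parameter value the user selects.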

You can configure snapshots for a report in Report Manager or a SharePoint document library, and have
them generated on a scheduled basis. You can also create a snapshot on demand when viewing the
history of a report in Report Manager or SharePoint server.

Demonstration: Configuring Report Execution


In this demonstration, you will see how to:

Configure Credentials for a Data Source.


Configure Caching for a Report.

Demonstration Steps
Configure Credentials for a Data Source
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are started, and log onto
20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Demofiles\Mod11 folder, run Setup.cmd as Administrator.

3. Start Visual Studio, and then open the Report Demo.sln solution in the D:\Demofiles\Mod11 folder.
Then on the Build menu, click Deploy Solution. When deployment is complete, close Visual Studio.

4. Start Internet Explorer and browse to SharePoint site at http://mia-sql/sites/adventureworks.

5. In the Reports document library, click Demo, and then click Data Sources to view the contents of
the Data Sources folder.

6. Click the ellipsis (...) for the AdventureWorksDW data source, then in the AdventureWorksDW.rds
information panel, click the ellipsis (...) and click View Dependent Items.

7. Note that the Reseller Sales report has a dependency on this data source. Then click Close.

8. Click the ellipsis (...) for the AdventureWorksDW data source, then in the AdventureWorksDW.rds
information panel, click the ellipsis (...) and click Edit Data Source Definition.

9. In the Credentials section of the configuration page, note that the data source is currently
configured to use the Windows authentication (Integrated) or SharePoint user option.

10. In the Credentials section, select the Stored credentials option and enter the following credentials:

o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

11. Select Use as Windows credentials and click Test Connection. Then when the connection has been
tested successfully, click OK.

Configure Caching for a Report


1. In the SharePoint site, click Reports and click Demo to view the contents of the Demo folder.

2. Click the ellipsis (...) for the Reseller Sales report, then in the Reseller Sales.rdl information panel,
click the ellipsis (...) and click Manage Processing Options.

3. In the Data Refresh Options section, select the Use cached data option, and then in the Cache
Options section, select On a custom schedule and click Configure.

4. In the Frequency section, select Day, in the Schedule section select all the days and set the Start
time to 12:00 am, and click OK. Then click OK again to set the processing options and return to the
Demo folder.

5. Click Reseller Sales and note the execution date and time under the report heading.
6. At the top of the report page, click the Demo link to return to the Demo folder, and then click
Reseller Sales again. Note that the execution date and time have not changed because the report
has been cached.
7. Keep Internet Explorer open for the next demonstration.

Lesson 3
Subscriptions and Data Alerts
In addition to optimizing report execution for interactive viewing, you can configure reports for automatic
delivery through subscriptions. Users can receive reports automatically by email or in a shared folder or
SharePoint document library, and can be sent email notifications of specific report data conditions
through data alerts.

Lesson Objectives
After completing this lesson, you will be able to:

Describe how subscriptions and data alerts can be used to push report data to users.

Subscribe to a report.

Create a data-driven subscription for a report.

Create a data alert.


Manage data alerts.

Introduction to Subscriptions and Data Alerts


Business users can view reports on-demand, but
they often require a push-oriented reporting
solution where reports are delivered to them
automatically on a regularly scheduled basis.
Additionally, users might only be interested in
viewing reports under specific data conditions,
and need to be alerted to those conditions before
browsing to the report.

SQL Server Reporting Services meets these needs


through two features: subscriptions and data
alerts.

Using Subscriptions to Deliver Reports


You can use subscriptions to deliver reports automatically on a scheduled basis. By creating subscriptions,
you can deliver reports in any supported rendering format to the following locations:

An email message.

A file share.

A SharePoint document library.

For example, you could use a subscription to send a weekly sales report to the sales manager as a
Microsoft Excel attachment in an email message, or you could save a product catalog report as a Word
document in a SharePoint document library every month.

Using Data Alerts to Notify Users of Report Data Conditions


You can use data alerts to notify users of specific report data conditions by sending an email message.
For example, each regional sales manager could request a data alert every month if the monthly sales
report contains a regional sales total that is less than a specified amount.

Note: Data alerts are only available in SharePoint Integrated mode.

To send subscriptions and data alerts by email, the report server must be configured with the address of
an SMTP server. For report servers in native mode, you can configure this setting in Reporting Services
Configuration Manager. When using SharePoint Integrated mode, you can configure email settings for
the report server service application in SharePoint Central Administration.
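In native mode, the email settings that Reporting Services Configuration Manager writes are stored in the RSEmailDPConfiguration section of RSReportServer.config. The fragment below is a minimal sketch; the server name and sender address are placeholder values for this course's environment, not required settings.

```xml
<!-- Email delivery settings for the Report Server Email extension. -->
<RSEmailDPConfiguration>
  <SMTPServer>mail.adventureworks.msft</SMTPServer>  <!-- placeholder SMTP host -->
  <From>reports@adventureworks.msft</From>           <!-- placeholder sender -->
  <SendUsing>2</SendUsing>  <!-- 2 = deliver through a network SMTP server -->
</RSEmailDPConfiguration>
```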

Subscribing to a Report
You subscribe to a report by creating a standard
subscription. In most cases, business users create
their own standard subscriptions, though
administrators can also use standard subscriptions
to deliver reports to email distribution lists, shared
folders, or document libraries.
Use the following procedure to create a standard
subscription:

1. View the report in Report Manager or


SharePoint Server.

2. In Report Manager, click the Subscriptions


tab and then click Subscribe. In SharePoint Server, on the Actions menu, click Subscribe.

3. Specify the appropriate options for your subscription.

Options for a subscription include:

The delivery extension. This specifies how the subscription will be delivered, and can be one of the
following:

o Email.

o Windows file share.


o SharePoint library (only available in SharePoint Integrated mode).

Extension-specific options. The options available depend on the delivery extension selected. For
example, if you select the email extension, you must specify an email address to which the report is
sent, a subject for the message, and other email-related options.

The report format. This option determines the rendering format of the report. For example, you
could send the report as a Microsoft Excel workbook or a Microsoft Word document.
The delivery schedule. You can schedule a subscription to deliver the report when a snapshot is
refreshed, on a subscription-specific schedule, or on a shared schedule.

Parameter values. If the report includes parameters, you must specify the values to be used when
delivering the subscription.

After you have created a standard subscription, you can view and manage your subscriptions in Report
Manager or SharePoint Server. Use one of the following techniques to view the Manage Subscriptions
page for a report:

In Report Manager, on the drop-down menu for the report, click Manage. Then click the
Subscriptions tab.
In SharePoint Server, on the menu for the report, click Manage Subscriptions.

Note: Standard subscriptions are available in SQL Server Standard Edition or higher.

Creating a Data-Driven Subscription


In some scenarios, you might want to create a
subscription that delivers the same report to
multiple recipients, each with their own preferred
options. Additionally, you might configure those
options dynamically when the subscription is
processed. To accomplish this, you can create a
data-driven subscription that determines delivery
options based on values in a database.

For example, you might need to deliver a sales


report to each regional sales manager, with the
data filtered using a parameter for the region.
Additionally, each sales manager might want to
receive the report in a different format. Instead of creating multiple standard subscriptions to meet this
requirement, you could create a single data-driven subscription that obtains the email address, region
parameter value, and preferred report format for each sales manager from a table of subscription data.

Data-driven subscriptions are generally created by administrators, and require a query to retrieve delivery
options from a database. Use the following procedure to create a data-driven subscription:

1. View the Manage Subscriptions page for the report.

2. In Report Manager, click New Data-Driven Subscription. In SharePoint Server, click Add Data-
Driven Subscription.

3. Specify a subscription name, and a data source and query for the subscription data.

4. Specify the delivery extension for the subscription. You can choose any of the delivery extensions
available for standard subscriptions. Additionally, a null delivery extension is available, which enables
you to use a subscription to pre-cache the report.

5. Specify subscription options as static values or fields from the subscription data.

6. Specify values for any report parameters. You can use default values, specify static values, or specify
values from fields in the subscription data.

7. Specify a schedule for the subscription.

Note: Data-driven subscriptions are supported in SQL Server Enterprise Edition or higher.

Demonstration: Using Subscriptions


In this demonstration, you will see how to:

Subscribe to a Report.

Create a Data-Driven Subscription.

Demonstration Steps
Subscribe to a Report

1. Ensure that you have completed the previous demonstration in this module.

2. In the SharePoint site at http://mia-sql/sites/adventureworks, view the Reseller Sales report.

3. On the Actions menu, click Subscribe.

4. In the Delivery Extension drop-down list, select E-Mail, and then enter the following settings:

o To: student@adventureworks.msft

o Comment: Report attached

o Include a link to the report: Selected

o Show report inside message: Selected


o Format: Excel

5. In the Delivery Event area, select On a custom schedule. Then click Configure and define a custom
schedule that will send the report daily, two minutes from the current time, and click OK. You can
determine the current system time by starting a command prompt window and entering the
command time /T.

6. In the Parameters area, ensure that Use Report Default Value is selected. Then click OK.

7. At the top of the report page, click the Demo link to return to the Demo folder.

8. Click the ellipsis (...) for the Reseller Sales report, then in the Reseller Sales.rdl information panel,
click the ellipsis (...) and click Manage Subscriptions.
9. Wait two minutes and then refresh the page. The Last Results column should indicate that the
subscription has run and the report was sent as an email message. Then minimize Internet Explorer.

10. View the contents of the C:\inetpub\mailroot\Drop folder and double-click the email message that
has been received by the local SMTP server to open it in Microsoft Outlook.

11. Read the email message and open the attached Excel file to view the report. Then close Excel and the
email message.

Create a Data-Driven Subscription

1. In the D:\Demofiles\Mod11 folder, double-click Subscription Table.sql to open it in SQL Server


Management Studio. Each time you are prompted, use Windows authentication to connect to the
database engine on the localhost server.

2. View the Transact-SQL code and note that it creates and populates a table named
ReportSubscriptionsDemo, which contains the following columns:

o SubscriptionID: a unique primary key.

o RecipientEmail: the email address of a subscription recipient.

o ReportFormat: the format in which the report should be rendered.

o Linked: a Boolean value that indicates whether the subscription email message should include a
link to the report on the report server.
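The Subscription Table.sql script in the demo files is authoritative; as an illustration only, a minimal sketch of the table it creates might look like the following. The column data types, format values, and sample rows are assumptions based on the column descriptions above and on the three formats (Excel, Word, and embedded HTML) verified later in the demonstration.

```sql
-- Hypothetical sketch of the demo subscription table; the actual
-- Subscription Table.sql script in D:\Demofiles\Mod11 is authoritative.
CREATE TABLE dbo.ReportSubscriptionsDemo
(
    SubscriptionID int           NOT NULL PRIMARY KEY, -- unique primary key
    RecipientEmail nvarchar(256) NOT NULL,             -- subscription recipient
    ReportFormat   nvarchar(50)  NOT NULL,             -- render format name (assumed values below)
    Linked         bit           NOT NULL              -- include a link to the report?
);

-- Three sample subscriptions, one per render format used in the demonstration.
INSERT INTO dbo.ReportSubscriptionsDemo (SubscriptionID, RecipientEmail, ReportFormat, Linked)
VALUES (1, N'student@adventureworks.msft', N'EXCELOPENXML', 1),
       (2, N'student@adventureworks.msft', N'WORDOPENXML',  0),
       (3, N'student@adventureworks.msft', N'MHTML',        1);
```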

3. Click Execute to run the query. Then when it has completed, close SQL Server Management Studio.

4. Maximize Internet Explorer, and in the Manage Subscriptions page for the Reseller Sales report,
click Add Data-Driven Subscription.

5. In the Description text box, type Weekly Sales Report. Then, in the Connection Type section, select
Shared data source. In the Data Source Link section, click the ellipsis (…) button, and in the Select
an Item dialog box, click the Data Sources folder, select the AdventureWorksDW data source and
click OK.

6. In the Query section, type SELECT * FROM ReportSubscriptionsDemo and click Validate. When the
query is validated successfully, click Next.

7. In the Year section, ensure that Use report default value is selected, and click Next.

8. In the Delivery Type section, ensure that E-Mail is selected. Then set the following configuration
values and click Next:

o To: Select a value from the database (select RecipientEmail).

o Include Report: True.


o Render Format: Select a value from the database (select ReportFormat).

o Subject: Specify a static value (enter Weekly sales report).

o Comment: Specify a static value (enter The weekly sales report is attached).

o Include Link: Select a value from the database (select Linked).

9. In the Delivery Event section, ensure that On a custom schedule is selected.

10. In the Frequency section, select Day.

11. In the Schedule section, select the current day and enter a time that is two minutes later than the
current time. You can determine the current system time by starting a command prompt window and
entering the command time /T. You can also use the command echo %date% to determine the
current day and date.

12. Click Finish and view the subscription details.

13. Wait for two minutes and then refresh the page. When the subscription has been processed, the Last
Results column should contain the message Done: 3 processed of 3 total; 0 errors.

14. View the contents of the C:\inetpub\mailroot\Drop folder and note the new email messages that have
been received by the local SMTP server.
15. Open the three most recent messages, and verify that the report has been sent in Excel, Word, and
embedded HTML formats.

16. Close all attachments, email messages, and folder windows. Keep Internet Explorer open for the next
demonstration.

Creating a Data Alert


Business users can use the following procedure to
create a data alert for a report, and be notified by
email if the data in the report meets specified
conditions:

1. View the report in SharePoint server, and on


the Actions menu, click New Data Alert.
2. Select the data feed in the report containing
the data you want to be notified about.
Typically, the data feed is a data region such
as a tablix or a chart.

3. Specify if you want to be notified when the


report includes data that meets the conditions defined in the alert, or when it does not contain data
that meets the conditions in the report.

4. Add one or more rules to determine the conditions for the alert. Rules are defined by selecting a field
in the data feed, a comparison operator (such as is or is greater than), and a value to compare the
field to. You can combine multiple comparisons to create complex rules.

5. Specify a schedule for the alert. Typically, you should discourage users from creating alerts that run
more frequently than daily, as this can have a negative effect on report server performance.

6. Specify the email settings for the alert, including the recipient address, subject, and description.

After you have saved the data alert, it will be processed at the specified interval. If the report includes data
that meets the conditions you specified in the alert, an email message will be sent to the specified email
address.

Managing Data Alerts


Users can use the following procedure to view and
manage their data alerts.

1. In SharePoint server, on the drop-down menu


for a report, click Manage Data Alerts. This
displays the Data Alert Manager page and
shows any alerts that the current user has
defined on the report.

2. If you want to review alerts for reports other


than the one you clicked, select the
appropriate option in the View alerts for
report drop-down list.

3. View the data alerts to see the number of notifications that have been sent for each alert, the date
and time each alert was last run, and the status of each alert.

4. Right-click any alert to edit, delete, or run it.



Demonstration: Creating a Data Alert


In this demonstration, you will see how to:

Create a Data Alert.

Manage Data Alerts.

Demonstration Steps
Create a Data Alert

1. In the SharePoint site at http://mia-sql/sites/adventureworks, click Reports, click Demo, and click
Reseller Sales to view the Reseller Sales report.

2. On the Actions menu, click New Data Alert.

3. In the New Data Alert - Reseller Sales dialog box, in the Report data name drop-down list, select
table1.

4. Click Add rule and click Country_Region1, then in the drop-down list for the rule value, click United
States. This creates a rule that sends an alert if the report table includes the value United States in
the Country_Region1 text box. This demonstrates the importance of assigning meaningful names to
report elements.

5. Under Schedule settings, change Daily to Minute, and ensure that the alert is scheduled for every 1
minute(s).

6. Expand Advanced, and if necessary, change the Start alert on date and time value to before the
current time.
7. In the Email settings section, change the Recipient(s) value to student@adventureworks.msft,
and then click Save.

8. Minimize Internet Explorer and view the contents of the C:\inetpub\mailroot\Drop folder. Then wait
for a minute and refresh the folder until a new email message appears.

9. Double-click the new message to view the alert. Then close the message and the folder window.

Manage Data Alerts

1. Maximize Internet Explorer, and above the report, click the Demo link to view the demo folder.

2. Click the ellipsis (…) for the Reseller Sales report, then in the Reseller Sales.rdl information panel,
click the ellipsis (…) and click Manage Data Alerts.

3. Note that the alert you created previously is listed and that the Last Run and Status columns provide
information about when the alert was last sent.

4. Right-click the alert, and on the shortcut menu that appears, click Delete.

5. Close Internet Explorer.



Lesson 4
Troubleshooting Reporting Services
A reporting solution can quickly become an indispensable part of an organization's infrastructure. It is
therefore extremely important that Business Intelligence (BI) professionals are able to troubleshoot
problems and monitor performance for Reporting Services.

Lesson Objectives
This lesson describes the primary locations of troubleshooting and performance information you can use
to manage the health of a Reporting Services solution. After completing this lesson, you will be able to:

Describe the log files you can use to troubleshoot Reporting Services issues.

Describe the performance counters you can use to monitor Reporting Services performance.

Reporting Services Logs


If you need to troubleshoot an issue with a report
server, you can find information to help you
diagnose the problem by enabling logging and
viewing data in the logs. There are three general
categories of logging you can use to troubleshoot
Reporting Services.

Execution Logging
Execution logging records statistics about report
execution. Before you can use execution logging
to troubleshoot an issue with Reporting Services,
you must enable it.

For report servers in native mode, you can enable execution logging on the Logging page of the report
server properties in SQL Server Management Studio.

For report servers in SharePoint Integrated mode, you can enable execution logging in the System
Settings page for the Reporting Services service application in SharePoint Central Administration.
You can enable execution logging in normal or verbose mode.

To view execution logging information, query the ExecutionLog, ExecutionLog2, and ExecutionLog3
views in the report server database.
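For example, the following query sketch uses the ExecutionLog3 view to find the slowest report executions in the last week. It assumes the default report server database name, ReportServer; in your environment, the database name may differ.

```sql
-- Identify the slowest report executions over the last 7 days.
-- ExecutionLog3 reports each duration in milliseconds.
USE ReportServer;  -- default catalog database name; yours may differ

SELECT TOP (20)
    ItemPath,                         -- path of the report that was executed
    UserName,                         -- user who requested the execution
    Format,                           -- render format used
    TimeDataRetrieval,                -- ms spent querying the data source
    TimeProcessing,                   -- ms spent processing the report
    TimeRendering,                    -- ms spent rendering the output
    TimeStart,
    Status
FROM dbo.ExecutionLog3
WHERE TimeStart >= DATEADD(DAY, -7, GETDATE())
ORDER BY (TimeDataRetrieval + TimeProcessing + TimeRendering) DESC;
```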

Trace Logging
Trace logging records information about errors and warnings as well as general diagnostic data. By
default, Reporting Services logs trace data to a file named ReportServerService_<timestamp>.log in the
\Microsoft SQL Server\<SQL Server Instance>\Reporting Services\LogFiles folder. Additionally, for report
servers in SharePoint Integrated mode, you can specify reporting services events to be logged to the
SharePoint Unified Logging Service (ULS) trace log.

To configure logging in the Reporting Services trace log file, edit the DefaultTraceSwitch and RSTrace
settings in ReportingServicesService.exe.config, which is in the \Program Files\Microsoft SQL Server
\MSRS11.<instance name>\Reporting Services\ReportServer\bin folder.
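As an illustration, the relevant fragments of ReportingServicesService.exe.config are similar to the following. The values shown are assumptions for a typical configuration; trace level 3 records informational messages, while 4 is verbose.

```xml
<!-- Fragment only: illustrative values, not a complete configuration file. -->
<system.diagnostics>
  <switches>
    <!-- 0 = off, 1 = errors, 2 = warnings, 3 = info, 4 = verbose -->
    <add name="DefaultTraceSwitch" value="3" />
  </switches>
</system.diagnostics>
<RStrace>
  <add name="FileName" value="ReportServerService_" />
  <add name="Prefix" value="tid, time" />
  <add name="TraceListeners" value="file" />
  <add name="Components" value="all:3" />
</RStrace>
```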
MCT USE ONLY. STUDENT USE PROHIBITED
11-18 Managing Report Execution and Delivery

To configure SharePoint ULS logging for Reporting Services, edit the SQL Server Reporting Services
events to be logged in the Configure Diagnostic Logging page in the Monitoring section of SharePoint
Central Administration.

HTTP Logging
HTTP logging records details of HTTP requests and responses made to the report server. This information
can be useful when diagnosing problems associated with user requests.

For report servers in native mode, you can log HTTP data in a file named
ReportServerService_HTTP_<timestamp>.log in the \Microsoft SQL Server\<SQL Server Instance>
\Reporting Services\LogFiles folder. To enable HTTP logging, you must add http:4 to the RSTrace section
of the ReportingServicesService.exe.config file, where you can also modify the HttpTraceFileName
attribute to customize the name of the log file.
For report servers in SharePoint Integrated mode, you can use the IIS log files to view information about
HTTP requests and responses.
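The http:4 setting is appended to the Components value in the RStrace section of ReportingServicesService.exe.config. An illustrative fragment (values assumed) follows:

```xml
<!-- Fragment only: enable verbose HTTP request logging in native mode. -->
<RStrace>
  <add name="Components" value="all:3,http:4" />
  <!-- Optional: customize the HTTP log file name prefix -->
  <add name="HttpTraceFileName" value="ReportServerService_HTTP_" />
</RStrace>
```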

Monitoring Reporting Services Performance


Although you can use data in Task Manager and
Windows Event Viewer to view Reporting Service
activity, Performance Monitor is the primary tool
for monitoring Reporting Services.

Monitoring the Reporting Services


web service
The Reporting Services web service manages
interactive reporting activity. For Reporting
Services instances in native mode, you can use the
counters in the MSRS 2011 Web Service object
to monitor the number of requests, cache usage,
and processing activity for interactive reports
viewed in Report Manager. When Reporting Services is deployed in SharePoint Integrated mode, the
MSRS 2011 SharePoint Mode Web Service object provides the same counters.

Monitoring the Reporting Services Windows service


The Reporting Services Windows service manages scheduled reporting activity, such as cache refreshes,
snapshot creation, data alerts, and subscription processing. For Reporting Services instances in native
mode, you can use the counters in the MSRS 2011 Windows Service object to monitor this activity.
When Reporting Services is deployed in SharePoint Integrated mode, the MSRS 2011 SharePoint Mode
Windows Service object provides the same counters.

Monitoring HTTP and memory activity


To monitor the number of HTTP requests and the amount of data transferred, you can use the counters in
the ReportServer:Service object, or the ReportServerSharePoint:Service object if the server is
configured in SharePoint Integrated mode. This object also includes the Memory Pressure State counter,
which indicates the memory pressure for Reporting Services, based on the thresholds discussed in the
previous topic. The Memory Pressure State counter can have one of the following values:

1: None

2: Low

3: Medium

4: High

5: Maximum exceeded

Lab: Configuring Report Execution and Delivery


Scenario
Some business users at Adventure Works Cycles have reported that the sales report takes a long time to
render, and have asked you to improve its performance.

Users also want to be able to subscribe to the report and have it delivered by email.

Additionally, some executives have requested that the report should be sent to them by email but each
executive wants the report in a different format.

Objectives
After completing this lab, you will be able to:

Configure report execution.

Implement a standard subscription.

Implement a data-driven subscription.

Estimated Time: 30 minutes


Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Configuring Report Execution


Scenario
You have developed a reporting solution for Adventure Works Cycles that includes a report used to view
sales for each product category by month. As the year progresses, the sales report can include a large
volume of data and be unacceptably slow to render. Business users can accept a latency of one day when
viewing the report, so you have decided to improve rendering performance by caching the report and
refreshing the cached instance at midnight each day.

The main tasks for this exercise are as follows:


1. Prepare the Lab Environment

2. Configure a Shared Data Source

3. Configure Report Caching

Task 1: Prepare the Lab Environment


1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to 20466C-MIA-
SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab11\Starter folder as Administrator.

3. Use Visual Studio to open and deploy the AWReports.sln solution in the D:\Labfiles\Lab11\Starter
folder. When deployment is complete, close Visual Studio.

Task 2: Configure a Shared Data Source


1. In the http://mia-sql/sites/adventureworks SharePoint site, in the Reports\Data Sources folder, view
the dependent items of the AdventureWorksDW data source and confirm that this data source is
used by the Internet Sales report.

2. Edit the data source definition for the AdventureWorksDW data source and configure it to use the
following stored Windows credentials:

o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

Task 3: Configure Report Caching


1. In the Reports document library in the http://mia-sql/sites/adventureworks SharePoint site, configure
the processing options for the Internet Sales report to use cached data. The cached report should
expire on a custom schedule at midnight every day.

2. View the report and note the execution time. Then wait a few minutes and view the report again,
verifying that the execution time has not changed since the previous view because a cached instance
is rendered.

Results: After this exercise, you should have configured a shared data source to use stored credentials,
and configured a report to display a cached instance.

Exercise 2: Implementing a Standard Subscription


Scenario
The senior sales executive at Adventure Works wants to receive the sales report in Microsoft Excel
format by email. To accomplish this, you intend to have the executive create a subscription. You also want
to test the functionality before providing instructions to the executive.

The main tasks for this exercise are as follows:

1. Subscribe to a Report

2. Verify the Subscription

Task 1: Subscribe to a Report


1. Subscribe to the Internet Sales report with the following subscription settings:

o Delivery Extension: E-Mail.

o To: student@adventureworks.msft.

o Comment: The sales report is attached.

o Show report inside message: Selected.


o Format: Excel.

o Delivery Event: On a custom schedule in two minutes from the current time. You can determine
the current system time by starting a command prompt window and entering the command
time /T.

o Parameters: Use report default value.

Task 2: Verify the Subscription


1. View the Manage Subscriptions page for the Internet Sales report.

2. View the information about the subscription you created in the previous task. Then wait for two
minutes and refresh the page until the Last Results column indicates that mail was sent to
student@adventureworks.msft.

3. View the contents of the C:\inetpub\mailroot\Drop folder and verify that an email message was sent.

4. Open the email message and verify that it contains the report in Microsoft Excel format.

Results: After this exercise, you should have created a standard subscription that delivers a report by
email.

Exercise 3: Implementing a Data-Driven Subscription


Scenario
A number of business users want to receive the sales report by email. However, the users have expressed a
preference for a variety of formats and subscription options. You have decided to use a data-driven
subscription to deliver the report to these users.

The main tasks for this exercise are as follows:

1. Create a Table of Subscription Data

2. Create a Data-Driven Subscription

3. Verify the Subscription

Task 1: Create a Table of Subscription Data


1. Execute the Subscription Table.sql script file in the D:\Labfiles\Lab11\Starter folder on the localhost
instance of the SQL Server database engine.

2. Note that this script creates a table named ReportSubscriptions, which contains the following
columns:

o SubscriptionID: a unique primary key.

o RecipientEmail: the email address of a subscription recipient.

o ReportFormat: the format in which the report should be rendered.

o Linked: a Boolean value that indicates whether the subscription email message should include a
link to the report on the report server.

Task 2: Create a Data-Driven Subscription


1. Create a data-driven subscription for the Internet Sales report.

2. Give the report the description Weekly Sales Report, and configure it to use the
AdventureWorksDW shared data source to retrieve subscription data with the following query:

SELECT * FROM ReportSubscriptions

3. Use the report default value for the Year parameter.

4. Use the E-Mail delivery type with the following settings:

o To: Select a value from the database (select RecipientEmail).

o Include Report: True.

o Render Format: Select a value from the database (select ReportFormat).

o Subject: Specify a static value (enter Weekly sales report).

o Comment: Specify a static value (enter The weekly sales report is attached).

o Include Link: Select a value from the database (select Linked).



5. Schedule delivery on a custom schedule on the current day, two minutes from the current time. You
can determine the current system time by starting a command prompt window and entering the
command time /T. You can also use the command echo %date% to determine the current day
and date.

Task 3: Verify the Subscription


1. View the details for the subscription you created in the previous task. Then wait for two minutes and
refresh the Manage Subscriptions page until the Last Results column indicates that the subscription
has been processed.

2. View the contents of the C:\inetpub\mailroot\Drop folder and verify that the email messages were
sent.

3. Open the email messages and verify that they contain the report.

Results: After this exercise, you should have created a data-driven subscription that delivers a report to
multiple recipients, in multiple formats by email.

Module Review and Takeaways


In this module, you have learned how to configure report execution settings, and how to use data alerts
and subscriptions to deliver reports.

Review Question(s)
Question: You want to reduce the time it takes to render a report containing a lot of data.
How can you achieve this?

Module 12
Delivering BI with SharePoint PerformancePoint Services
Contents:
Module Overview 12-1

Lesson 1: Introduction to SharePoint Server as a BI Platform 12-2

Lesson 2: Introduction to PerformancePoint Services 12-9

Lesson 3: PerformancePoint Data Sources and Time Intelligence 12-12

Lesson 4: Reports, Scorecards, and Dashboards 12-16

Lab: Implementing a SharePoint Server BI Solution 12-23

Module Review and Takeaways 12-29

Module Overview
SharePoint Server is an increasingly important part of the end-to-end solution for the centralized delivery
of Business Intelligence (BI) solutions. SharePoint Server provides a platform that makes it easier for
business users to share and collaborate on a wide range of information. It provides PerformancePoint
Services as a platform for delivering BI through dashboards containing business performance scorecards
and reports.

Objectives
After completing this module, you will be able to:
Describe SharePoint Server as a BI platform.

Use PerformancePoint Services to deliver BI functionality.

Configure PerformancePoint Data Sources.

Create Reports, Scorecards, and Dashboards.



Lesson 1
Introduction to SharePoint Server as a BI Platform
SharePoint Server can play an integral part of a SQL Server 2014 business intelligence solution. It is
important for BI professionals to understand how SharePoint services can be employed to deliver BI
solutions to executives and business users.

Lesson Objectives
After completing this lesson, you will be able to:

Describe how SharePoint Server fits into a BI project.

Describe SharePoint Server.


Describe the required SharePoint services to support BI.

Describe the SharePoint farm topology options.

Describe sites and subsites for BI.

What Is SharePoint Server?


SharePoint Server makes it easier for users to
collaborate and work together. Previous releases
of SharePoint Server provided a web-based portal
that featuring file sharing and document
management. However, with each release, these
capabilities have been expanded to include the
creation of social networks, search, and BI
capabilities. SharePoint Server uses a Microsoft
Office user interface that makes it intuitive for
business users, and provides a centralized platform
that IT departments can manage.

SharePoint Server is available in the following form


factors:
On-premises. You can install and configure SharePoint Server on a dedicated Windows system. This
enables you to customize the solution for the business. The web front-end and application layers can
be virtualized using Windows Hyper-V.

Cloud. Office 365 provides SharePoint capabilities and is offered as a service to which your
organization can subscribe. This solution is useful to organizations without the expertise to implement
a full SharePoint environment.

When planning a SharePoint Server BI solution, there are three tiers of the architecture to consider:

Web front-end tier. One or more servers that are used to accept requests for a SharePoint
application/service and direct to the appropriate application server.
Application tier. One or more servers that host the service applications in the SharePoint Server
infrastructure.

Data tier. A SQL Server instance, which can be clustered, that hosts SharePoint databases.

Each tier can contain multiple servers to meet the business requirements for performance, scalability,
and/or availability. The key point is that all servers must belong to the same SharePoint farm, a logical
grouping of servers that provides the infrastructure for a SharePoint Server solution.

SharePoint terminology
Before you start planning a BI solution that uses SharePoint Server, it is important to understand all the
core components and terminology involved. The following list defines some important concepts:

SharePoint Server farm. A farm is a collection of servers that work together to provide SharePoint
services. Each server in the farm hosts one or more SharePoint Server components, and the entire
farm constitutes a logical container for all the SharePoint services provided by those servers and the
core unit of administration.

SharePoint databases. SharePoint Server is primarily a platform for publishing and collaborating on
content. The content in a SharePoint site, together with farm configuration data and application
settings, is stored in one or more SQL Server databases.

Service applications. SharePoint Server provides an extensible platform that can deliver a broad
range of services. Each service is encapsulated in an application, which can be hosted on one or more
application servers in the SharePoint Server farm.

Web applications. These are Internet Information Services (IIS) applications where users can
consume SharePoint Server services. The services available in a specific web application are
determined by associating application services in the farm with the web application.

Site collection. As its name suggests, this is a collection of SharePoint sites hosted in a web
application. You can use a site collection as a central unit of management and configuration for
multiple sites. SharePoint Server supports site collection features, which can be enabled or disabled at
the site collection level.
Site. A site is a container for related content, and provides a specific endpoint to which users can
browse. Sites inherit the features of their parent site collection, each having features that can be
enabled or disabled on a site-by-site basis.

Apps. Site content is delivered through visual elements, which are known as apps. SharePoint Server
includes apps, such as document libraries and lists, which you can use to create the user interface for
the site. Service applications and third-party software developers can provide additional apps.
Subsites. In many cases, you can deliver all the content you need to in a site. However, you can also
group related content into subsites under a parent site. Subsites inherit the features of their parent
site.

SharePoint Farm Topology Options


As part of the technical architecture and
infrastructure design phase, a SharePoint farm
topology should be designed in line with the
business requirement. This section describes the
topologies that can be configured for a single
SharePoint farm.

Single Server
In this farm topology, all the SharePoint Server
architecture layers are hosted on a single Windows
Server. The main benefit of this model is that the

licensing cost for the solution is minimized. Typically, this type of configuration is found in development
or training environments, and provides the easiest setup of a SharePoint farm. However, because all three
layers run on the same Windows server and share the same hardware, there can be increased contention
of resources. If this affects the performance of business reports, consider implementing a scale-out
solution.

Scale Out
In this farm topology, each of the SharePoint Server architecture layers is separated onto different
Windows servers. Scaling out a SharePoint farm distributes the workload across multiple servers, reducing
contention on a single server, and improving throughput. This does, however, come with an additional
licensing cost, and more infrastructure preparation is required to seamlessly manage security across the
SharePoint farm. There is also no resilience if one of the servers shuts down. Scale out should be
considered a valid topology when performance is important but resilience is not required.

High Availability
In this farm topology, the SharePoint architecture layers are separated across Windows servers, and then
each layer is duplicated onto another server. This is the most expensive topology to implement, but it
provides load balancing and high availability across the entire SharePoint farm. The infrastructure
preparation is similar to that of creating a scale-out architecture. However, a Network Load Balancer is
also required to distribute incoming requests to the first available web front-end server.

Note: The Domain Name System (DNS) is used to resolve host names to IP addresses. In a high
availability topology, a DNS record must be created to map the name of the SharePoint site
to the IP address of the Network Load Balancer. It is important that an A resource record is
used if the IP address uses IP version 4, or an AAAA resource record if the IP address uses IP
version 6. Alternatively, the DNS administrator can create a Canonical Name (CNAME) record; however,
CNAME records do not work with SharePoint farms using Windows authentication.

SharePoint Services for BI


After SharePoint Server is installed on each server
in the SharePoint farm, use SharePoint Central
Administration to configure the services and
components that support BI activities
within SharePoint Server.

Site collection and site


The first step after installing SharePoint server is to
create a site collection and at least one site. This
will provide the web portal through which users
will access BI applications and documents. For BI
purposes, ensure you define a site name that is
understood by business users and use the Business
Intelligence Center template to ease the configuration of BI components within the SharePoint site.

Excel Services
After the site is defined, configure Excel Services. This should be done for two reasons: it enables your
business users to share and collaborate with Excel files, and it is also a prerequisite service for PowerPivot
for SharePoint. Excel Services is configured on the application layer servers within a SharePoint farm.

Claims to Windows Token Service


This service is required when a service such as Excel Services has to communicate with a remote data
source hosted outside the SharePoint farm. Within a SharePoint farm, communication takes place
between the farm servers, using claims-based authentication. Typically, a user connects to a SharePoint
farm using Windows authentication and must connect to the remote data source with the same
credential.

When a user connects, SharePoint converts the incoming Windows authentication token to a claims-based
token for use within the farm. For outgoing connections from the SharePoint farm, the Claims to Windows
Token Service performs the reverse conversion, turning the claims-based token back into a Windows
authentication token that the remote data source can validate. Because this service deals with the
sensitive task of handling authentication tickets, the account running it should be added to the Local
Administrators group on the server on which it is hosted.
Additionally, Local Security Policy configuration is required in Windows to enable the following rights:

Act as part of the operating system.

Impersonate a client after authentication.


Log on as a service.

For a BI implementation of a SharePoint farm, the Claims to Windows Token Service must be configured
on the same server on which Excel Services is installed.

PowerPivot
PowerPivot provides a way for users to create and share Excel workbooks containing self-service data
models. This service requires running SQL Server Setup on each application server on the SharePoint farm,
which will create a PowerPivot instance of Analysis Services on which the PowerPivot workspaces can be
created when using the application. After it is installed, SharePoint Central Administration can be used to
complete the integration with SharePoint Server.

Note: To learn more about PowerPivot, attend course 20467C: Designing Self-Service
Business Intelligence and Big Data Solutions.

Reporting Services Integrated mode


In SQL Server 2014, Reporting Services in SharePoint Integrated mode is a shared service within
SharePoint. It can be completely managed within SharePoint Server. Like PowerPivot for SharePoint,
Reporting Services in SharePoint Integrated mode requires the use of SQL Server Setup to install Reporting
Services in SharePoint Integrated mode on the application servers. It also requires installing the Reporting
Services add-in on the web front-end servers. When complete, SharePoint Central Administration can be
used to complete the integration with SharePoint Server.

PerformancePoint Services
PerformancePoint Services is installed during SharePoint Server setup on the application servers on the
farm. In the context of a BI solution, selecting the Business Intelligence Center template when creating a
site means that only minimal configuration within Central Administration is required to start this service.

Additional Services
In addition to the services already discussed, the following services are recommended because of the
value they can add to a BI solution:

Search service
As more documents are added to the SharePoint farm, it can become cumbersome to manually search for
documents. Enabling the SharePoint Search service will catalog the content of a SharePoint site so that
the Search feature can be used to quickly retrieve documents.

Secure store service


The Secure Store Service provides an alternative method of authentication, similar to storing credentials
in the ReportServer database in Reporting Services native mode, but it is available to a wider range of
applications. It greatly simplifies the configuration of authentication for many services, including
PerformancePoint Services and PowerPivot.

Microsoft Office Web Apps


Office Web Apps provide web-based versions of Microsoft Office products, including Word, Excel, and
PowerPoint, with basic functionality that allows a user to edit an Office document through a web
browser.

SharePoint logging
Logging is an extremely useful SharePoint component to enable, particularly during the installation and
configuration of the SharePoint farm. Any configuration errors are reported to the Unified Logging
Service (ULS) log file, located in the C:\Program Files\Common Files\Microsoft Shared\Web Server
Extensions\15\LOGS folder. If there is a problem in the SharePoint farm, open the latest file and search
for the word "error".
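To make the troubleshooting step concrete, the following Python sketch automates the same check: it finds the most recent file in a log folder and returns the lines that mention an error. The folder path, function name, and plain substring search are illustrative assumptions, not part of SharePoint itself (real ULS logs are tab-separated text files).

```python
import glob
import os

def find_errors(log_folder):
    """Return lines containing 'error' from the newest .log file in log_folder."""
    log_files = glob.glob(os.path.join(log_folder, "*.log"))
    if not log_files:
        return []
    # Pick the most recently modified log file, as the text suggests.
    latest = max(log_files, key=os.path.getmtime)
    with open(latest, "r", errors="ignore") as f:
        return [line.rstrip() for line in f if "error" in line.lower()]

# Example call (path as given in the text; adjust for your installation):
# for line in find_errors(r"C:\Program Files\Common Files\Microsoft Shared"
#                         r"\Web Server Extensions\15\LOGS"):
#     print(line)
```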

Note: As a best practice, it is recommended that each service runs under its own separate
domain user account. This tightens security within the SharePoint farm, and also makes it easier
to troubleshoot any configuration errors, because the error logs record the name of the service
account. To make it easier to run the lab for this module, all services run under the same account,
but you should not use this approach on production servers.

Sites and Subsites for BI


SharePoint Server provides a web portal through
which business users can access the data stored on
a SharePoint site. The flexible nature of SharePoint
means that a site collection can be created with
one site that can be used by the entire
organization. However, for large enterprises, you
can also create subsites to provide further
separation of the business data from an
organizational and security perspective. Ultimately,
the business requirements will determine which
type of users can access specific data. If security is
a concern when gathering the requirements,
subsites may be needed.

After a site or a subsite is created, depending on the template selected, a site structure will be created
with specific apps and features enabled. When you use the Business Intelligence Center template, default
folders are created for PerformancePoint data sources and dashboards. You can also add additional
document libraries and PowerPivot Galleries to support reports and PowerPivot workbooks.

Before you can create a subsite in a SharePoint Server site, you must activate:

The SharePoint Server Publishing Infrastructure site collection feature.

The SharePoint Server Publishing site feature.



Demonstration: Creating a Business Intelligence Subsite


In this demonstration, you will see how to:

Enable SharePoint Publishing.

Create a Subsite.

Demonstration Steps
Enable SharePoint Publishing

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are started, and log onto
20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Demofiles\Mod12 folder, run Setup.cmd as Administrator. If the script prompts you to
confirm the deletion of a SharePoint site, press Enter.

3. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. The first time you browse
to this site, it may take a few minutes to open.

4. In the title bar for the home page, next to Student, click the Settings icon and in the menu, click Site
settings.

5. On the Site Settings page, under Site Collection Administration, click Site collection features.

6. On the Site Collection Features page, in the SharePoint Server Publishing Infrastructure row, if
the feature is not already active, click Activate, and then wait for the Active indicator to appear.

Note: The feature can take a few minutes to activate.

7. At the top of the Site Collection Features page, click Site Settings to return to the Site Settings
page.

8. Under Site Actions, click Manage site features.

9. On the Site Features page, in the SharePoint Server Publishing row, if the feature is not already
active, click Activate, and then wait for the Active indicator to appear.

10. At the top of the Site Features page, click Adventure Works Portal to return to the home page.

Create a Subsite

1. On the Adventure Works Portal home page, in the Quick Launch pane on the left, click Site
Contents.

2. At the bottom of the Site Contents page, click new subsite.

3. On the New SharePoint Site page, under Title, in the text box, type Demo BI Portal.

4. Under Description, in the text box, type A subsite for BI.

5. In the URL name text box, type bi, so that the URL for the new subsite is http://mia-sql/sites/adventureworks/bi.

6. In the Template Selection area, under Select a template, on the Enterprise tab, click Business
Intelligence Center.

7. At the bottom of the page, click the Create button. After a short time, the Demo BI Portal site is
displayed.

8. Select the URL in the Internet Explorer navigation bar, right-click it, and then click Copy.

9. In the Quick Launch area, click Home.

10. On the home page, under the Quick Launch area, click EDIT LINKS, and then click link.

11. In the Add a link dialog box, in the Text to display box, type BI Portal, right-click the Address box,
click Paste, and then click OK.

12. Under link, click Save.

13. In the Quick Launch area, click the new BI Portal link and verify that the Demo BI Portal site is
displayed.

14. In the Demo BI Portal, in the title bar for the home page, next to Student, click the Settings icon,
and then click Site settings.

15. On the Demo BI Portal Site Settings page, under Look and Feel, click Navigation.
16. On the Navigation Settings page, in the Current Navigation section, select Structural Navigation:
Display only the navigation items below the current site. At the top of the page, click OK. Note
that the Quick Launch area now only shows links for the items in the BI Portal subsite, and not for the
parent site.

17. Click the image above the Quick Launch area; this provides a navigation link to the home page of the
subsite.

18. At the top of the page, click Adventure Works Portal. This returns you to the parent site home page.

19. Close Internet Explorer.



Lesson 2
Introduction to PerformancePoint Services
PerformancePoint Services is available within SharePoint Server Enterprise edition, and enables the
creation of highly visual reports without the need to install SQL Server. Adding this capability
side by side with SQL Server BI components provides a complete BI solution within the organization.

Lesson Objectives
After completing this lesson, you will be able to:

Describe PerformancePoint Services.

Describe how to enable PerformancePoint Services in SharePoint Server.


Describe the Dashboard Designer.

Describe PerformancePoint data sources.

Describe KPIs, reports, scorecards, and dashboards.

What is PerformancePoint Services?


PerformancePoint Services is a feature within
SharePoint Server that enables the creation of
KPIs, reports, scorecards, and dashboards without
the need to install SQL Server. Although many of
these features can be created by using the BI
elements of SQL Server 2014, the tighter
integration of PerformancePoint Services within
SharePoint Server 2013 provides convenience for
creating BI items. The Dashboard Designer
provides the starting point for users to create
highly visual reports. This is fully integrated into
SharePoint Server 2013.

PerformancePoint Services is only available in SharePoint Server Enterprise edition, and is automatically
enabled when the Business Intelligence Center template is selected during site creation. It provides a
graphical environment designed to make it easy for users to create dashboards, and reduces the
development time required to create a report compared to tools such as Reporting Services. However,
Reporting Services provides more flexibility in defining the layout of a report or dashboard:
PerformancePoint Services dashboards have a fixed layout, and although a choice of layouts is presented
in the Dashboard Designer, it is not as flexible as Reporting Services.
Combining PerformancePoint Services with technologies such as Reporting Services and PowerPivot gives
the business more versatility in creating BI reports. This increases the capability to create a wide range of
reports for multiple audiences.

Configuring PerformancePoint Services


When the SharePoint server farm is created, you
must perform a number of steps to ensure that
PerformancePoint Services are available.

Ensure that the correct versions of the ADOMD.NET and AMO client libraries are installed
If you are installing PerformancePoint Services on
a server that does not have Excel Services or
PowerPivot configured, you must install the
ADOMD.NET component on the server that will
host PerformancePoint Services. This can be found
in the SQL Server 2014 SP1 feature pack.
Additionally, if you plan to import KPIs from SQL Server Analysis Services, you should install version 10.0
of the AMO library, which is available in the SQL Server 2008 R2 feature pack.

Configure the PerformancePoint application pool


Start the PerformancePoint Services service in the SharePoint Server farm. PerformancePoint Services runs
in the context of an application pool, and you can set this in the properties of the PerformancePoint
Services service application in SharePoint Central Administration. In most scenarios, you should create a
dedicated service account and application pool for each SharePoint service, including PerformancePoint
Services.

Configure an unattended service account if required


To enable PerformancePoint Services to access data sources, an unattended service account can be defined
to connect on behalf of users. PerformancePoint Services uses the Secure Store Service to store the
unattended service account password, so before using the unattended service account, make sure that
Secure Store has been configured. If the user's identity must be used when connecting to a back-end data
source, Kerberos delegation and impersonation can also be configured to work with PerformancePoint.

Demonstration: Configuring an Unattended Service Account


In this demonstration, you will see how to:

Configure an Unattended Service Account.

Demonstration Steps
Configure an Unattended Service Account

1. Ensure that you have completed the previous demonstration in this module.
2. On the Start screen, type SharePoint and start the SharePoint 2013 Central Administration app.
When prompted to allow the program to make changes, click Yes.

3. In SharePoint Central Administration, under Application Management, click Manage service


applications.

4. In the list of service applications, click PerformancePoint Services Application. Make sure you click
the link for the application, and not the link for its proxy.

5. On the Manage PerformancePoint Services page, click PerformancePoint Service Application


Settings.

6. On the PerformancePoint Service Application Settings page, ensure that Unattended Service
Account is selected, and if a Change User button is present, click it.

7. Enter the following user credentials, and then click OK:

o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd
8. Close the SharePoint Central Administration application.

Dashboard Designer
The Dashboard Designer is the primary tool used
by end users to create PerformancePoint data
sources and content elements. Dashboard
developers can open the Dashboard Designer
from the SharePoint ribbon, on the
PerformancePoint tab.
The Dashboard Designer contains a workspace
browser that enables the creation of data sources
and PerformancePoint content. The data sources
support SharePoint Lists, Excel Services, SQL Server
tables and Excel workbooks, as well as
multidimensional (Analysis Services) data sources.
The PerformancePoint content provides support for reporting templates, including KPIs, reports,
scorecards, and dashboards.

Lesson 3
PerformancePoint Data Sources and Time Intelligence
PerformancePoint Services is a platform for displaying BI information in a SharePoint Server site. To
display this information, it must connect to the data sources where the data is stored. Additionally, users
generally want to view data in the context of specific time periods, so PerformancePoint data sources
often include configuration settings that map specific date values in the data source to relative time
periods.

Lesson Objectives
After completing this lesson, you will be able to:

Create a Data Source for PerformancePoint Services.

Configure Time Intelligence for a PerformancePoint data source.

Create a Time Intelligence filter.

PerformancePoint Data Sources


PerformancePoint can use data from the following
sources:

SQL Server Analysis Services.

SQL Server database engine.


Excel Services.

Excel Workbook.

SharePoint List.

You can create a data source for each source you plan to use, specifying source-specific connection
information, such as the server name, database, and cube required to use data in SQL Server Analysis
Services.

In addition to the connection string information, you must specify how PerformancePoint Services will be
authenticated when connecting to the data source on behalf of a SharePoint Server user. You can use one
of the following authentication options:

Unattended Service Account. This setting causes PerformancePoint Services to make the connection
using the unattended service account specified in SharePoint Central Administration.

Use a Stored Account. This setting causes PerformancePoint Services to use credentials that are
stored in the SharePoint Server secure store service.

Per-User identity. This setting causes PerformancePoint to use the identity of each individual user
when connecting to the data source.

Data Source Time Intelligence Settings


PerformancePoint Services uses time intelligence to support filtering of data based on relative time
periods such as "this year," "last year," "this month," "this quarter," "this quarter last year," and
so on.

To support this functionality, each data source must include settings that map data values to specific
time intervals, and an indication of how time values in the data source map to "now." When connecting to
an Analysis Services data source, you can enable time intelligence by specifying the following settings:
Time Dimension. The cube dimension that should be used to aggregate measures by time.

Reference Member. A member in the time dimension that is mapped to a specific reference date.

Reference Date. The calendar date to which the reference member is mapped.

Time Member Associations. Hierarchy levels in the time dimension and the specific calendar
intervals that they represent, for example, year, semester, quarter, month, and day.
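The reference member and reference date act as an anchor: once one member of the time dimension is pinned to a known calendar date, every other member's date can be derived from its distance to that anchor. The Python sketch below illustrates this idea with an invented day-level member index; it is a conceptual model, not PerformancePoint's implementation.

```python
from datetime import date, timedelta

def member_to_date(member_index, reference_index, reference_date):
    """Resolve a day-level time member to a calendar date by applying its
    offset from the reference member to the reference date (illustrative)."""
    return reference_date + timedelta(days=member_index - reference_index)

# If member 0 of the Day level is the reference member, mapped to the
# reference date January 1, 2008, then member 31 falls on February 1, 2008:
print(member_to_date(31, 0, date(2008, 1, 1)))  # 2008-02-01
```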

Time Intelligence
When data sources have been configured with
Time Intelligence settings, PerformancePoint
Services enables users to filter data based on a
relative time period that is specified using Time
Intelligence functions and expressions, including:
Day (today)

Month (this month)

Month-1 (last month)

Year (the current year)

YearToDate (the current year to date)

Year-1 (last year)

Additional Reading: For more information about PerformancePoint Time Intelligence functions,
go to http://technet.microsoft.com/en-us/library/ff701696.aspx.

Demonstration: Using Time Intelligence


In this demonstration, you will see how to:

Create a Data Source.

Configure Time Intelligence Settings.


Use Time Intelligence Functions.

Demonstration Steps
Create a Data Source

1. Ensure you have completed the previous demonstrations in this module.

2. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. In the Quick Launch area,
click BI Portal.

3. In the Quick Launch area, click Data Connections.

4. On the Ribbon, on the PerformancePoint tab, click Dashboard Designer.

5. In the Internet Explorer prompt to open the file, click Open. If the Application Run Security
Warning dialog box is displayed, click Run.

Note: The Dashboard Designer can take a few minutes to open.

6. In the Dashboard Designer, in the Workspace Browser pane, right-click Data Connections, and then
click New Data Source.

7. In the Select a Data Source Template dialog box, under Template, click Analysis Services, and
then click OK.

8. When the new data source is created, rename it to DemoDB.

9. Under Connection Settings, in the Server text box, type MIA-SQL\SQL2. In the Database drop-
down list, click DemoDB, and in the Cube drop-down list, click Internet Sales.

Configure Time Intelligence Settings

1. On the DemoDB page, click the Time tab.

2. In the Time Dimension drop-down list, click Order Date.Calendar Date.

3. In the Choose a date to begin the year box for the selected time dimension, click Browse.

4. In the Select Members dialog box, select January 1, 2008, and then click OK.

5. In Hierarchy level list, click Day.


6. In the Enter a date that is equal to the period specified by the reference member above list,
select January 1 of the current year.

7. In the Time Member Associations pane, create the following mappings:


o Calendar Year: Year

o Calendar Quarter: Quarter

o Month: Month

o Date: Day

8. In the Workspace Browser pane, right-click DemoDB, and then click Save.

Use Time Intelligence Functions

1. In Dashboard Designer, in the Workspace Browser pane, click PerformancePoint Content. On the
ribbon, on the Create tab, click Filter.

2. In the Select a Filter Template dialog box, select Time Intelligence, and click OK.

3. On the Select a data source page, click Add Data Source, select the DemoDB data source and click
OK, and then click Next.

4. On the Enter Time Formula page, in the Formula column, type Year, and in the Display Name
column type Current Year.
5. In the new empty row under the values you entered, in the Formula column, type Year-1 and in the
Display Name column type Last Year.

6. Click Preview, and note the dimension members from the data source that are mapped to these
functions. Then click Close, and on the Enter Time Formula page, click Next.

7. On the Select Display method page, select List, and click Finish.

8. Rename the new filter to Year.



Lesson 4
Reports, Scorecards, and Dashboards
PerformancePoint Services is primarily designed to display visualizations of BI information in SharePoint
Server pages. These visualizations are implemented as reports or scorecards, and displayed as dashboards.

Lesson Objectives
After completing this lesson, you will be able to:

Create PerformancePoint Services reports.

Create PerformancePoint Services scorecards.

Create PerformancePoint Services dashboards.

Reports
Reports provide an interactive, graphical
representation of data that can be displayed on a
dashboard page. You can create many kinds of
report with PerformancePoint Services, including:
Analytic charts and grids.

Excel Services worksheets and PivotTables.

Strategy Maps.

KPI Details.

SQL Server Reporting Services reports.

Dashboard Designer provides a graphical report design environment in which you can create reports by
dragging measures and dimension hierarchies from an Analysis Services data source. One of the key
benefits of using PerformancePoint Services to create reports is that they provide drill-down
interactivity. For example, you might produce a report containing
a pie chart showing sales revenue by product category based on a dimension hierarchy that includes
category, subcategory, and product levels. When the report is displayed on a SharePoint Server site, users
can click the pie segment for a particular category, and the chart will be redrawn to show sales for
subcategories in that category. Clicking a subcategory segment redraws the chart to show sales for
products in that subcategory.
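The drill-down behavior described above can be pictured as walking a category hierarchy and aggregating leaf values at whatever level is currently displayed. The following Python sketch uses invented sales figures purely to illustrate the idea:

```python
# Hypothetical revenue hierarchy: category -> subcategory -> product.
sales = {
    "Bikes": {"Road Bikes": {"Road-150": 5000, "Road-250": 3500},
              "Mountain Bikes": {"Mountain-100": 4200}},
    "Accessories": {"Helmets": {"Sport Helmet": 800}},
}

def _total(node):
    """Sum all leaf values beneath a node."""
    if isinstance(node, dict):
        return sum(_total(child) for child in node.values())
    return node

def chart_level(node):
    """Aggregate each member of the current level, as a chart would plot it."""
    return {name: _total(child) for name, child in node.items()}

print(chart_level(sales))           # top level: revenue per category
print(chart_level(sales["Bikes"]))  # after "clicking" Bikes: per subcategory
```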

You can create SQL Server Reporting Services reports from servers in native or SharePoint Integrated
mode. For each report, you must specify its location and parameter values. You can also choose to display
the Report Viewer toolbar and parameters pane if you wish.

Demonstration: Creating Reports


In this demonstration, you will see how to:

Create an Analytic Chart Report.

Create a Reporting Services Report.

Demonstration Steps
Create an Analytic Chart Report

1. Ensure you have completed the previous demonstrations in this module.

2. In Dashboard Designer, on the ribbon, on the Create tab, click Analytic Chart.

3. In the Create an Analytic Chart Report dialog box, on the Workspace tab, click DemoDB, and then
click Finish.
4. When the report is created, rename it to Sales Chart.

5. In the Details pane on the right of the screen, expand Measures.

6. Drag the Revenue measure to the Bottom Axis area.

7. In the Details pane, expand Dimensions, and expand Product Category.

8. Drag the EnglishProductCategoryName attribute hierarchy to the Series area.

9. In the Series area, click the EnglishProductCategoryName drop-down arrow. Then, in the Select
Members dialog box, select All and click OK.

10. In the Details pane, under Dimensions, expand Order Date.

11. Drag the Calendar Date dimension hierarchy to the Background area.
12. Right-click Sales Chart, and then click Save.

Create a Reporting Services Report

1. In Dashboard Designer, on the ribbon, on the Create tab, click Reporting Services.

2. Rename the new report to Sales Report.

3. In the Sales Report pane, in the Server mode drop-down list, select SharePoint Integrated.

4. In the SharePoint Site box, type http://mia-sql/sites/adventureworks.

5. In the Document Library drop-down list, select Reports.

6. In the Report drop-down list, select Sales Trends.rdl.

7. Clear the Show toolbar check box, and ensure that Show parameters is not selected. In the Report
parameters table, note that the CalendarYear parameter has been set to use its default value.

8. In the Workspace Browser pane, right-click Sales Report, and then click Save.

Scorecards
A scorecard is a collection of KPIs that enables
users to drill down into hierarchies to identify
specific areas of the business that are over- or
under-performing against the target. For example,
a scorecard could show the sales revenue KPI
discussed earlier in this topic aggregated by sales
region. At the top level, the scorecard shows sales
revenue performance against the target for the
company as a whole, but users can expand the
scorecard to view performance for individual
regions.

A scorecard can contain multiple KPIs that measure different aspects of business performance to provide
an overall view of how the organization is
meeting its targets. For example, a scorecard might include KPIs for sales revenue, profitability, and
productivity levels based on hours of continuous operation for plant machinery. The KPIs can each be
weighted to reflect their relative importance to the overall goals of the business, and a total score can
then be calculated. This approach is often referred to as a balanced scorecard, because it balances
multiple factors to provide a high-level view of how the business is performing.
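As an arithmetic illustration of the weighting idea (the scores and weights below are invented), a balanced score is simply a weighted average of the individual KPI scores:

```python
def balanced_score(kpis):
    """Combine KPI scores (each 0.0-1.0 against its target) using relative weights."""
    total_weight = sum(weight for _, weight in kpis.values())
    return sum(score * weight for score, weight in kpis.values()) / total_weight

# Hypothetical KPIs: (score against target, relative weight).
kpis = {
    "Sales revenue": (0.90, 0.5),
    "Profitability": (0.70, 0.3),
    "Productivity":  (0.80, 0.2),
}
print(round(balanced_score(kpis), 2))  # 0.82
```

Because the weights are normalized by their sum, they express relative importance rather than absolute values, which matches the balanced-scorecard idea described above.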

Scorecard KPIs
KPIs display visual objects that measure numeric metrics against a target. In PerformancePoint Services,
you can use the Dashboard Designer to create KPIs that compare an actual value against a target
value. Both the actual and target values are associated with data sources and formulae, and can be
formatted with an appropriate number format. You can also define threshold percentages that determine
the icons to use when comparing the actual value to the target value.
For example, a business requirement might be to track sales revenue performance with a year-on-year
growth target of 10 percent. To support this requirement, you might create a KPI with the following
characteristics:
The actual value uses a data source that applies the YearToDate time-intelligence function to the
sales revenue measure. This results in a figure that shows the sales revenue for the current year so far.

The target value uses the same data source to create a variable named LastYearSales that is based
on the formula YearToDate-1, which returns the sales revenue figure for the year-to-date period in
the previous year. The target value is then defined as LastYearSales * 1.1 (in other words, last year's
revenue to date plus 10 percent).

Both values are formatted as currency.

Thresholds are configured to show:

o A red indicator if the actual value is less than 75 percent of the target value.

o A yellow indicator if the actual value is between 75 and 95 percent of the target value.

o A green indicator if the actual value is above 95 percent of the target value.

You can define KPIs in the Dashboard Designer, basing them on values such as measures from an Analysis
Services data source. Additionally, you can import KPIs that are already defined in Analysis Services cubes,
and format them as required in Dashboard Designer.
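The target and threshold logic of the example KPI can be sketched in a few lines of Python. This is an illustration under one consistent reading of the bands (red below 75 percent, yellow from 75 to 95 percent, green above); the revenue figures are invented.

```python
def kpi_status(actual, last_year_to_date, growth=0.10,
               red_below=0.75, yellow_below=0.95):
    """Score year-to-date revenue against last year's figure plus 10 percent."""
    target = last_year_to_date * (1 + growth)   # LastYearSales * 1.1
    ratio = actual / target
    if ratio < red_below:
        return "red"
    if ratio < yellow_below:
        return "yellow"
    return "green"

# 900,000 against a 1,100,000 target is about 82 percent of target:
print(kpi_status(actual=900_000, last_year_to_date=1_000_000))  # yellow
```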

Demonstration: Creating a Scorecard


In this demonstration, you will see how to:

Create a Scorecard from Analysis Services Data.

Edit KPI and Scorecard Settings.

Demonstration Steps
Create a Scorecard from Analysis Services Data

1. Ensure you have completed the previous demonstrations in this module.

2. In Dashboard Designer, in the Workspace Browser pane, click PerformancePoint Content. On the
ribbon, on the Create tab, click Scorecard.

3. In the Select a Scorecard Template dialog box, select Analysis Services, and click OK.
(If the Select a Scorecard Template dialog box does not appear, delete the new scorecard that is added,
click the Office button and click Designer Options, and select the Use wizards to create scorecards
check box. Then click Save and repeat this step).
4. On the Select a data source page, select the DemoDB data source and click Next.

5. On the Select a KPI Source page, select Create KPIs from SQL Server Analysis Services measures,
and click Next.
6. On the Select KPIs to Import page, click Add KPI.

7. Set the following values for the new KPI and click Next:

o Name: Margin KPI


o Actual: Margin

o Band Method: Increasing is Better

o Targets: Target Margin


8. On the Add Measure Filters page, click Next.

9. On the Add Member Columns page, click Next.

10. On the Locations page, click Finish.

11. When the scorecard is added, rename it to Sales Scorecard.

Edit KPI and Scorecard Settings

1. In the Workspace Browser pane, select Margin KPI.

2. In the Margin KPI pane, click Target.

3. In the Thresholds pane, note the thresholds and indicators that have been defined for the KPI value.

4. In the Workspace Browser pane, select Sales Scorecard.

5. In the Details pane, expand Dimensions, expand Geography, and then drag
EnglishCountryRegionName to the right edge of the Margin KPI cell.

6. In the Select Members dialog box, select All and then click OK.

7. On the ribbon, on the Edit tab, click Update. The scorecard is updated to show the KPI indicator for
all geographies.

8. Expand All, and note that the KPI is shown for each country or region. Then collapse All so that only
the summary is shown.

Dashboards
Dashboards are PerformancePoint components
that enable the user to bring together multiple
PerformancePoint objects in one place.

Dashboard Page Layouts


A dashboard page provides preset layouts that
allow you to choose which PerformancePoint
object should populate a particular area. There are
different page layouts, including the following:

1 Zone

2 Columns

2 Rows

3 Columns

3 Rows

Column, Split Column

Header, 2 Columns

Dashboard Connections
A key benefit of creating a dashboard is that, in addition to a single view of high-level business
performance data, the various data elements on the dashboard can be linked. For example, clicking a
column for product category sales revenue in a column chart might filter a different chart to show
profitability and productivity data for the selected category.

These links are implemented as connections between the different elements in the dashboard, in which a
property value from one element is used to set another property value in a connected element.
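The connection mechanism described above can be sketched as a simple one-way binding, in which a source element pushes one of its property values into a property of each connected element. The class and property names below are hypothetical, chosen only to mirror the PerformancePoint terminology:

```python
class FilterElement:
    """A source element that pushes its selected value to connected elements."""
    def __init__(self):
        self.connections = []  # list of (target, property_name) pairs

    def connect(self, target, property_name):
        self.connections.append((target, property_name))

    def select(self, value):
        # Selecting a new value pushes it into every connected element.
        for target, prop in self.connections:
            target.update(prop, value)

class ReportElement:
    """A target element whose properties can be set by a connection."""
    def __init__(self, name):
        self.name = name
        self.properties = {}

    def update(self, prop, value):
        self.properties[prop] = value

year_filter = FilterElement()
chart = ReportElement("Sales Chart")
year_filter.connect(chart, "Member Unique Name")
year_filter.select("[Order Date].[Calendar Date].[2008]")
print(chart.properties["Member Unique Name"])  # [Order Date].[Calendar Date].[2008]
```

The point of the sketch is the direction of data flow: the filter never reads from the chart; it only pushes its value into the chart's bound property, which is why a single filter can drive several dashboard elements at once.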

Demonstration: Creating a Dashboard


In this demonstration, you will see how to:

Create a Dashboard.

Create Connections.
Deploy a Dashboard.

Demonstration Steps
Create a Dashboard

1. Ensure you have completed the previous demonstrations in this module.

2. In Dashboard Designer, on the ribbon, on the Create tab, click Dashboard.

3. In the Select a Dashboard Page Template dialog box, click 3 Rows, and then click OK.

4. When the new dashboard is created, rename it to Sales Dashboard.

5. In the Pages pane, select Page 1 and rename it to Sales Performance.

6. In the Details pane, expand Filters, and then expand PerformancePoint Content.

7. Drag the Year filter to the Top Row area of the dashboard.

8. In the dashboard page, right-click the Center Row area and click Split Zone.

9. In the Details pane, expand Scorecards, and then expand PerformancePoint Content.
10. Drag Sales Scorecard to the Center Row area of the dashboard, which should be on the left side of
the page.

11. In the Details pane, expand Reports, and then expand PerformancePoint Content.

12. Drag the Sales Chart report to the Zone 1 area of the dashboard, which should be to the right of the
Sales Scorecard element.

13. Drag Sales Report to the Bottom Row area of the dashboard.

Create Connections

1. In the Center Row area, in the Sales Scorecard drop-down list, click Create Connection.

2. In the Connection dialog box, in the Get value from list, select Top Row (1) Year, and on the
Values tab, in the Connect to list, select TI formula, and in the Source value list, select Formula.
Then click OK.

3. In the Zone 1 area, in the Sales Chart drop-down list, click Create Connection.

4. In the Connection dialog box, in the Get value from list, select Top Row (1) Year; and on the
Values tab, in the Connect to list, select Order Date.Calendar Date, and in the Source value list,
select Member Unique Name. Then click OK.
5. In the Bottom Row area, in the Sales Report drop-down list, click Create Connection.

6. In the Connection dialog box, in the Get value from list, select Top Row (1) Year, and on the
Values tab, in the Connect to list, select CalendarYear, and in the Source value list, select Display
Value. Then click OK.

Deploy a Dashboard

1. In the Workspace Browser pane, right-click Untitled Workspace, and then click Save.

2. Save the workspace as Sales Workspace in the D:\Demofiles\Mod12 folder.

3. In the Workspace Browser pane, right-click Sales Dashboard, and then click Deploy to SharePoint.

4. In the Deploy To dialog box, expand Demo BI Portal, click Dashboards, and then click OK. The
dashboard is uploaded to SharePoint Server and opened in a new tab in Internet Explorer. Note that
Sales Report might take longer to load than the other dashboard elements.

5. View the dashboard, noting the information it contains. Note that, at smaller screen resolutions, you
may need to scroll within each dashboard element to see all the data.

6. In the Year drop-down list, select Last Year and note that the dashboard elements are all updated to
reflect sales data for 2007.

7. In the Sales Chart report, hold the mouse over the column (which shows all revenue) and view the
details in the tooltip. Then click the column and note that the chart updates to show the revenue for
each product category.

8. Click the drop-down arrow at the upper-right of the chart, and then click Reset View to return to the
default chart view for all sales territories.

9. In the Sales Scorecard area, expand All to view the sales performance in each country or region.
10. In the Sales Report report, expand each category to see monthly revenue.

11. Close Internet Explorer and close the Dashboard Designer.



Lab: Implementing a SharePoint Server BI Solution


Scenario
You have created a data warehouse and analytical data model for Adventure Works Cycles, and
implemented managed reporting with Reporting Services. Executives at Adventure Works Cycles have
requested that you provide an easy way for them to see critical business information at a glance. You have
decided to implement a BI dashboard in SharePoint Server.

Objectives
After completing this lab, you will be able to:

Create a SharePoint Server site for BI.

Configure PerformancePoint data sources and filters.


Create a PerformancePoint scorecard.

Create PerformancePoint reports.

Create a PerformancePoint dashboard.

Lab Setup
Estimated Time: 75 Minutes

Virtual Machine: 20466C-MIA-SQL


User account: ADVENTUREWORKS\Student

Password: Pa$$w0rd

If you are unfamiliar with SharePoint Server 2013, it is recommended that you perform the lab using the
lab answer key instead of the high-level steps.

Exercise 1: Creating a SharePoint Server Site for BI


Scenario
You have decided to create a SharePoint subsite for business intelligence at
http://mia-sql/sites/adventureworks/bi.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Enable SharePoint Publishing

3. Create a Subsite

Task 1: Prepare the Lab Environment


1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab12\Starter folder as Administrator. If prompted, press Enter to


confirm the deletion of a SharePoint site. The script may take a few minutes to complete.

Task 2: Enable SharePoint Publishing


1. Browse to the Adventure Works SharePoint Server site at http://mia-sql/sites/adventureworks. The
first time you do this, it may take a few minutes to open.

2. Edit the site settings to activate the following features if they are not already active:

o The SharePoint Server Publishing Infrastructure site collection feature.

o The SharePoint Server Publishing site feature.

Tip: To view the site settings, use the Settings menu, which can be accessed from the Settings icon at
the upper-right of the home page next to the name of the user who is currently logged on.

Task 3: Create a Subsite


1. Add a new subsite to the contents of the Adventure Works Portal site.

2. The subsite should be based on the Business Intelligence Center enterprise template and be
accessible at http://mia-sql/sites/adventureworks/bi.

3. On the Adventure Works Portal home page, in the Quick Launch area on the left, add a link to the
new subsite with the caption BI Portal.

4. In the BI subsite, modify the navigation settings so that only the navigation items below the current
site are displayed. This prevents the subsite from inheriting the Quick Launch area of the parent site,
and reduces confusion when navigating similarly-named document libraries in both sites.

Results: At the end of this exercise, you should have created a subsite based on the Business Intelligence
Center template at http://mia-sql/sites/adventureworks/bi.

Exercise 2: Configuring PerformancePoint Data Access


Scenario
You have decided to use PerformancePoint Services to deliver BI information through SharePoint Server.
Most of the BI reports and scorecards you plan to create will be based on the Sales cube in the Analysis
Services OLAP database, so you must create a data source for it. You must also configure time intelligence
for the data connection so that users can filter data by selecting specific time periods such as this year or
last year.
The main tasks for this exercise are as follows:

1. Configure the PerformancePoint Unattended Account

2. Create a Data Source

3. Create a Filter

Task 1: Configure the PerformancePoint Unattended Account


1. Use the SharePoint 2013 Central Administration tool to configure the settings of the
PerformancePoint Services service application so that it uses the following unattended service
account:

o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

Task 2: Create a Data Source


1. In the Data Connections Library on the Adventure Works BI Portal site at
http://mia-sql/sites/adventureworks/bi, launch the PerformancePoint Dashboard Designer.

2. Create a data source named Adventure Works OLAP that connects to the Sales cube in the
Adventure Works OLAP Analysis Services database on MIA-SQL.

3. Configure the Adventure Works OLAP data source to use a time dimension based on the Calendar
Date hierarchy in the Order Date dimension.

4. Specify January 1, 2008 in the cube as the start date for the time dimension, and specify that this
reference member is at the Day level of the hierarchy.

5. Map the reference member above to January 1st of the current year so that PerformancePoint time
intelligence considers the 2008 data in the cube to be data for the current year.

6. Map the following dimension attributes to the time hierarchy levels in the data sources:

o Calendar Year: Year

o Calendar Semester: Semester

o Calendar Quarter: Quarter

o Month: Month

o Day: Day
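The effect of the reference-member mapping above can be sketched as follows: because January 1, 2008 is treated as January 1 of the current year, a time intelligence formula such as Year-1 resolves to 2007 in the cube. This simplified Python sketch (an illustration of the idea, not the PerformancePoint implementation, and handling only whole-year offsets) shows the resolution:

```python
def resolve_ti_year(formula, reference_year=2008):
    """Resolve a simple time intelligence formula such as 'Year' or 'Year-1'
    to a cube year, treating reference_year as the current year.

    Only whole-year offsets are handled in this sketch.
    """
    formula = formula.strip()
    if formula == "Year":
        return reference_year
    if formula.startswith("Year-"):
        return reference_year - int(formula[len("Year-"):])
    if formula.startswith("Year+"):
        return reference_year + int(formula[len("Year+"):])
    raise ValueError(f"Unsupported formula: {formula}")

print(resolve_ti_year("Year"))    # 2008 (Current Year)
print(resolve_ti_year("Year-1"))  # 2007 (Last Year)
```

This is why the filter you create in the next task can offer "Current Year" and "Last Year" choices even though the cube contains only historical data.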

Task 3: Create a Filter


1. Use the Dashboard Designer to create a time intelligence filter in the PerformancePoint Content
folder.

2. The filter should be based on the Adventure Works OLAP data source you created in the previous
task, and it should include the following formulae:

o Year (with the display name Current Year).

o Year-1 (with the display name Last Year).

3. Name the filter Sales Year.

Results: At the end of this exercise, you will have configured the credentials used by PerformancePoint
Services, created a PerformancePoint data source for the Adventure Works OLAP database, and created a
filter that will enable users to display BI data for the current year or the previous year.

Exercise 3: Creating PerformancePoint Reports


Scenario
In addition to the reseller margin scorecard, the executives at Adventure Works want to see a column
chart that shows reseller profit for each sales region, and an existing SQL Server Reporting Services report
showing monthly revenue for each sales territory.

The main tasks for this exercise are as follows:

1. Create an Analytic Chart Report

2. Create a Reporting Services Report

Task 1: Create an Analytic Chart Report


1. Use the Dashboard Designer to create an analytic chart named Reseller Profit in the
PerformancePoint Content folder.

2. Configure the chart to show the Reseller Profit measure on the bottom axis with the Sales Territory
dimension as a series.

3. Filter the series to include only the Europe, North America, and Pacific sales regions.

4. Add the Calendar Date hierarchy from the Order Date dimension to the Background area of the
report, so that it can be used to filter the report by a specified time period.

5. Format the chart as a column chart with the legend shown at the top.

Task 2: Create a Reporting Services Report


1. Use the Dashboard Designer to create a Reporting Services report named Reseller Revenue in the
PerformancePoint Content folder.

2. Configure the report to use SharePoint Integrated mode and connect to the Reseller Revenue.rdl
report in the Reports document library of the http://mia-sql/sites/adventureworks SharePoint site.

3. Ensure that the report will be displayed without the toolbar or parameters pane, and that the Year
parameter uses its default value.

Results: At the end of this exercise, you will have created an analytic chart named Reseller Profit, and a
PerformancePoint report, based on SQL Server Reporting Services, named Reseller Revenue.

Exercise 4: Creating a PerformancePoint Scorecard


Scenario
The executives at Adventure Works Cycles have requested the ability to see a visual indication of how the
company is performing against its reseller sales profit margin goal in each territory. A KPI for this goal
exists in the Sales cube, so you must import that KPI into the PerformancePoint Content folder and
display it in a scorecard.

The main tasks for this exercise are as follows:

1. Create a Scorecard
2. Edit a KPI

3. Edit a Scorecard

Task 1: Create a Scorecard


1. Use the Dashboard Designer to create an Analysis Services scorecard in the PerformancePoint
Content folder.

2. The scorecard should be based on the Adventure Works OLAP data source, and it should import the
Reseller Margin KPI from Analysis Services.

3. Name the scorecard Reseller Scorecard.

Task 2: Edit a KPI


1. In the Dashboard Designer, view the Reseller Margin KPI that you have imported from Analysis
Services.

2. Change the number format for each element of the KPI so that all values are displayed as
percentages.

Task 3: Edit a Scorecard


1. Modify the Reseller Scorecard so that it displays the Reseller Margin KPI for each of the following
members of the Sales Territory dimension:

o Europe

o North America

o Pacific

2. Modify the Goal and Status metric so that only the Score value is displayed.

3. Hide the Trend column, so that it is not shown when the scorecard is displayed in SharePoint Server.

4. Modify the scorecard settings so that the toolbar is visible when the scorecard is displayed in
SharePoint Server.

Results: At the end of this exercise, you will have added a KPI named Reseller Margin and a scorecard
named Reseller Scorecard to the Dashboard Designer.

Exercise 5: Creating a PerformancePoint Dashboard


Scenario
You have created all the BI elements required by the executives at Adventure Works Cycles. Now you
must combine these elements into a dashboard and publish it to the BI subsite you created in SharePoint
Server.

The main tasks for this exercise are as follows:

1. Create a Dashboard

2. Create Connections

3. Deploy a Dashboard

4. Browse a Dashboard

Task 1: Create a Dashboard


1. Create a two-column dashboard named Reseller Dashboard.

2. Add the Sales Year filter, the Reseller Profit report, and the Reseller Scorecard scorecard to the left
column.

3. Add the Reseller Revenue report to the right column.

4. Deploy the dashboard to the Dashboards folder in the Adventure Works BI Portal site.
5. Make the dashboard the home page for the Adventure Works BI Portal site.

6. Explore the dashboard and verify that the chart and scorecard provide interactive functionality.

Task 2: Create Connections


1. In the Reseller Dashboard you created in the previous task, create a connection in the Reseller Profit
report that connects to the Sales Year filter. The connection should map the Order Date.Calendar
Date hierarchy in the report to the Member Unique Name property of the filter.

2. Create a connection in the Reseller Scorecard scorecard that connects to the Sales Year filter. The
connection should map the TI Formula of the scorecard to the Formula property of the filter.

3. Create a connection in the Reseller Revenue report that connects to the Sales Year filter. The
connection should map the Year parameter of the report to the Display Value property of the filter.

Task 3: Deploy a Dashboard


1. Save the dashboard workspace as Reseller Workspace in the D:\Labfiles\Lab12\Starter folder.

2. Deploy the dashboard to the Dashboards folder in the Adventure Works BI Portal site.

3. Make the dashboard the home page for the Adventure Works BI Portal site.

4. View the dashboard, noting the information it contains. Note that, at smaller screen resolutions, you
may need to scroll within each dashboard element to see all the data.

Task 4: Browse a Dashboard


1. Browse the dashboard you have created, and verify that selecting a value in the Sales Year filter
updates all the elements to reflect the selected time period (in the context of 2008 being the current
year).

2. View the tooltips for the columns in the Reseller Profit report, and verify that you can drill down into
sales territories by clicking columns in the chart. Note that you can return to the default view by
clicking Reset View in the drop-down list for the dashboard element.
3. Verify that you can drill down into sales territories in the Reseller Scorecard scorecard, to see KPIs for
each level of the hierarchy.

4. Verify that you can drill down into sales territories in the Reseller Revenue report to see monthly
revenue for each sales region.

Results: At the end of this lab, you will have created a dashboard named Reseller Dashboard and
published it to SharePoint Server.

Module Review and Takeaways


This module explored how SharePoint Server can be used to add value to an overall BI project by
providing a centralized platform for storing BI applications and content. It explored the various farm
topology options that can be implemented to meet an organization's performance, scalability, and
availability requirements. It also outlined which BI components should be installed at each layer of the
farm topology. The discussion of Kerberos explained the configuration that is required when the ability
to audit is an important part of the business requirement.

The module concluded with an exploration of the capabilities of Reporting Services, PowerPivot, and
PerformancePoint Services in a SharePoint farm, and how these can be centralized in a single subsite to
provide a one-stop shop for an organization's BI platform.

Review Question(s)
Question: Now that you are familiar with the capabilities that SharePoint Server brings to a BI
project, what considerations would there be for implementing SharePoint Server as part of a BI
project in your organization?

Module 13
Performing Predictive Analysis with Data Mining
Contents:
Module Overview 13-1

Lesson 1: Overview of Data Mining 13-2

Lesson 2: Creating a Data Mining Solution 13-8

Lesson 3: Validating a Data Mining Model 13-12

Lesson 4: Consuming Data Mining Data 13-17

Lab: Using Data Mining to Support a Marketing Campaign 13-21

Module Review and Takeaways 13-26

Module Overview
Data mining enables you to use the data you own to gain insights that can help you make intelligent
decisions about your business. Microsoft SQL Server 2014 Analysis Services includes data mining tools
you can use to identify patterns in your data, helping you to determine why particular things happen and
to predict what will occur in the future. This module introduces data mining, describes how to create a
data mining solution, how to validate data mining models, how to use the Data Mining Add-ins for
Microsoft Excel, and how to incorporate data mining results into Reporting Services reports.

Objectives
After completing this module, you will be able to:

Describe the key data mining concepts and use the Data Mining Add-ins for Excel.

Create a data mining solution.

Validate data mining models.

Use data mining data in a report.



Lesson 1
Overview of Data Mining
Data mining is a special kind of data analysis that involves using statistical models to reveal connections
and correlations in large sets of data that would otherwise be very difficult or even impossible to identify.

Lesson Objectives
This lesson explains the purpose of data mining, and describes the components of a data mining solution
and the algorithms used to build prediction models. The lesson also introduces the Data Mining Add-ins
for Excel, which you can use to perform desktop data mining analysis.

After completing this lesson, you will be able to:

Describe the purpose of data mining.

Describe the components of an Analysis Services data mining solution.

Describe the different types of data mining algorithms you can use in Analysis Services data mining
solutions.

Use the Data Mining Add-ins for Excel to perform table analysis.

What Is the Purpose of Data Mining?

Revealing Hidden Patterns and Trends


Exploration of the data in a data warehouse has
the potential to reveal patterns and trends that
can be useful to organizations in many ways.
These include enabling them to predict customer
behavior, or to provide customers with product
recommendations in real time as they shop online.
However, the size and complexity of data in large
data warehouses can make it very difficult to
derive useful information from raw data by using
standard data analysis tools and techniques. Data
mining is the statistical analysis of large volumes of data that would be very difficult to analyze manually.
With data mining, business users can overcome the problems associated with analyzing large and
complex data sets to access useful and actionable information.

Data Mining Algorithms


Data mining involves using algorithms to search through data to extract patterns and trends. For example,
a retail organization might want to discover if customers in large urban areas are more likely to buy goods
in a high price range, compared to customers living in more rural locations. The organization could use
data mining to determine any correlations between location and price, how strong the relationships might
be, or how multiple correlating factors affect outcomes when they are considered together. There are
various data mining algorithms you can use, depending on the type of questions you want to discover the
answers to.

Data mining can help you to identify trends and patterns that may not be immediately obvious. You can
use data mining to predict unknown values based on statistics and patterns in previously-analyzed sets of
data. This is useful when you are trying to predict and plan for future events.

Data Mining Scenarios


Forecasting. Companies can use data mining to analyze sales patterns to determine, for example,
when a particular product will sell or how stores will profit over time.

Targeted advertising. Organizations can create targeted marketing and advertising campaigns by
identifying the factors that best predict which customers are most likely to purchase a given
product, and targeting the campaign at those individuals.

Recommendations. Companies can recommend merchandise to customers as they shop online by


analyzing their purchasing histories and previous sales of products.

Risk assessment. Insurance companies can assess the likelihood of a claim being fraudulent by
employing data mining algorithms that use the outcomes of previous claims to create a weighting for
each factor affecting the new claim. Credit ratings agencies can use financial and customer history
data to predict which individuals are most likely to default on a loan.

Components of an Analysis Services Data Mining Solution


To create a data mining solution in SQL Server
2014 Analysis Services, you must install Analysis
Services in Multidimensional and Data Mining
mode. Tabular mode and PowerPivot for Microsoft
SharePoint mode do not support data mining.
You then need to create a data mining structure, a
case table, and a data mining model.

Data Mining Structure


The data mining structure is the central
component of a data mining solution. It performs
several functions, including:

Contains the data source view from which the data to be mined is derived. This view is based on a
data source that connects to either a relational or an Analysis Services multidimensional database.

Defines the case table and the mining structure columns that, in turn, express the data and content
types.

Specifies a training set. Different portions of the data are used to train data mining models and to
test them.

Defines, as a percentage, the proportion of the data to hold out for the testing set.

Contains one or more data mining models that use the same data source view and case table.
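The training and testing split defined on the mining structure can be illustrated with a simple percentage-based holdout. Analysis Services performs this partitioning internally when you set the testing percentage; the function below is only a sketch of the idea:

```python
import random

def split_cases(cases, testing_percentage=30, seed=0):
    """Split a list of cases into training and testing sets.

    `testing_percentage` mirrors the holdout setting on a mining
    structure; a fixed random seed keeps the split repeatable.
    """
    rng = random.Random(seed)
    shuffled = cases[:]
    rng.shuffle(shuffled)
    cut = round(len(shuffled) * testing_percentage / 100)
    return shuffled[cut:], shuffled[:cut]  # (training, testing)

cases = list(range(100))
training, testing = split_cases(cases, testing_percentage=30)
print(len(training), len(testing))  # 70 30
```

Shuffling before cutting matters: if the source rows are ordered (for example, by date), taking the last 30 percent without shuffling would bias both sets.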

Case Table
The case table stores the source data for data mining models. When you specify the case table, you define
the data type and content type for each column. Content types include discrete, continuous, and cyclical.

Discrete. These are values that are not part of a sequence. Discrete columns contain data that has a
finite number of values with no continuum between them. Examples of discrete data include number
of children, phone number, or gender. Discrete values can be numeric or non-numeric.

Continuous. These represent sequences of numeric data on a scale. Examples of continuous data
include temperature or weight.

Cyclical. These represent data that is organized into limited, ordered sets that repeat. Examples of
cyclical data include numbered days of the week or numbered months of the year.

Note: For more information about Content Types (Data Mining), go to
http://go.microsoft.com/fwlink/?LinkID=246796.

Data Mining Model


Each data mining model uses a data mining algorithm, which you specify when you create the model. The
mining model uses the algorithm to analyze data from the data mining structure. When you create a data
mining model, you define the columns from the data mining structure, and specify a usage value for each
column. Columns from the data mining structure for which you do not specify a usage value are not
included in the model. The available usage values are:
Key. This indicates a key column containing values that identify each row uniquely.

Predictable. This indicates the column for which you want to predict values in the mining model.

Input. This indicates that the model should use this column to help forecast values for the predictable
column.

For example, if you want to predict the amount of money each customer is likely to spend on a
supermarket website, you could use CustomerID as the key column, CustomerSpend as the predictable
column, and various others, containing data such as address, age, and number of children, as input
columns.

Analysis Services Data Mining Algorithms


Data mining algorithms provide rules for the
analysis of data. SQL Server 2014 Analysis Services
provides nine algorithms that you can use to
create data mining models. You can also add
further third-party plug-in algorithms if necessary.
Widely-used data mining algorithms fall into the
following broad categories:

Classification algorithms. These predict one


or more discrete variables based on other
attributes. Microsoft Decision Trees is an
example that might be used to forecast
whether a customer will purchase a particular
product. Microsoft Neural Network and Microsoft Naive Bayes are other classification algorithms.

The correct algorithm for your purposes depends on a number of factors, such as the volume of data
or the specific types of column being analyzed. For example, the Naive Bayes algorithm can use
discrete columns (such as City) to classify data but doesn't support continuous columns that may be
grouped into ranges (such as Age). If you need to classify data into groups based on ranges of
continuous values, the Decision Trees algorithm may be a better choice.

Regression algorithms. These predict one or more continuous variables, such as profit or loss. The
Microsoft Time Series algorithm is an example that might be used to determine a retail store's
seasonal sales for the coming year. Microsoft Linear Regression and Microsoft Logistic Regression can
also predict continuous variables.

Segmentation or clustering algorithms. These divide data into groups or clusters of items that have
similar properties. Microsoft Clustering, for example, might be used to divide customers into groups
with similar purchasing habits or preferences.

Association algorithms. These find correlations between different attributes in a data set. The
Microsoft Association algorithm is an example that might be used to identify which items, such as
products that are frequently purchased together, are likely to appear in the same transaction.

Sequence analysis algorithms. Sequence analysis finds common sequences in data. The Microsoft
Sequence Clustering algorithm is an example that might be used to find common web clickthrough
paths or the order of placing items in a cart.

Note: For more information about Data Mining Algorithms (Analysis Services, Data
Mining), go to http://go.microsoft.com/fwlink/?LinkID=246797.

Note: For more information about Plugin Algorithms, go to
http://go.microsoft.com/fwlink/?LinkID=246798.
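As a toy illustration of the classification category, the following sketch implements a count-based naive Bayes prediction over a single discrete input column. It is deliberately minimal and is not the Microsoft Naive Bayes algorithm; the data, column names, and simplified add-one smoothing are all illustrative assumptions:

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, target):
    """Build class priors and per-(column, value) conditional counts.

    rows: list of dicts of discrete values; target: the predictable column.
    """
    priors = Counter(row[target] for row in rows)
    cond = defaultdict(Counter)  # (column, value) -> Counter of classes
    for row in rows:
        cls = row[target]
        for col, val in row.items():
            if col != target:
                cond[(col, val)][cls] += 1
    return priors, cond

def predict(priors, cond, row):
    """Score each class by prior * smoothed conditional probabilities."""
    total = sum(priors.values())
    scores = {}
    for cls, prior in priors.items():
        score = prior / total
        for col, val in row.items():
            # Simplified add-one smoothing avoids zero probabilities
            # for input values never seen with this class.
            score *= (cond[(col, val)][cls] + 1) / (prior + 2)
        scores[cls] = score
    return max(scores, key=scores.get)

rows = [
    {"City": "Urban", "Buys": "Yes"},
    {"City": "Urban", "Buys": "Yes"},
    {"City": "Rural", "Buys": "No"},
    {"City": "Rural", "Buys": "No"},
]
priors, cond = train_naive_bayes(rows, "Buys")
print(predict(priors, cond, {"City": "Urban"}))  # Yes
```

The production algorithms differ in scale and statistical rigor, but the shape is the same: learn patterns from training cases, then score a new case against each possible value of the predictable column.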

Data Mining Add-ins for Excel


The Data Mining Add-ins for Excel are a free,
downloadable package of extensions to Excel that enable
you to perform a wide range of data mining
analyses using the familiar Microsoft Office
interface. There are two add-ins in the package:

Data Mining Client for Excel. This add-in


helps you to prepare data mining data, build
and validate models, and manage data mining
models. You can also use the Data Mining
Client for Excel to browse and query data
mining models. You can use data in the Excel
worksheet or external data stored in an
Analysis Services database.
Table Analysis Tools for Excel. This add-in allows you to perform table analyses, such as analyzing
key influencers, identifying exceptions in data sets, and performing shopping basket analysis. The table analysis tools
are designed so they can be used by individuals with no understanding of data mining or Analysis
Services principles.

The table analysis tools include:

Analyze Key Influencers. This tool enables you to identify the factor that most strongly influences a
particular outcome. For example, if you want to identify the characteristics that make customers most
likely to purchase a specific product, you can use the Analyze Key Influencers tool to correlate
information such as customer location, age, and number of children, with a column that records
whether the customer purchased the product in question.

Detect Categories. This tool enables you to find columns that strongly correlate to create new
categories. For example, the tool might discover a correlation between the 45-55 age group and the
$75,000-plus income bracket. You can then use this information to create a category for a marketing
campaign.

Fill From Example. This tool uses sample values to provide missing values for the rows in a column.
Missing values can impair the usefulness of other data mining tools.

Forecast. This extrapolates future values from existing trends in time series data to create predictions.
For example, you can use the Forecast tool to predict sales figures for the coming year.

Highlight Exceptions. This tool identifies rows that do not match the patterns found in the majority
of rows in the data set. You can then analyze these rows further, or exclude them from the data set.

Scenario Analysis. This tool enables you to evaluate the effects of proposed scenarios. With the Goal
Seek scenario, you can identify the factors you must change to meet a desired goal. The What-If
scenario enables you to assess the effects of a change before you implement it. For example, you
could predict how an increase in product prices could affect sales.

Prediction Calculator. This tool accepts a target value for a specific column, and then correlates it
with values in other columns to identify the most common patterns. The tool presents results in the
form of a scorecard.

Shopping Basket Analysis. This tool enables you to perform cross-selling analysis. For example, you
can use the tool to identify products that frequently sell together, and use this to generate
recommendations to customers who are browsing your website.

Demonstration: Performing Table Analysis in Excel


In this demonstration, you will see how to:

Enable the Excel Data Mining Add-Ins.

Use the Excel Table Analysis Tools.

Demonstration Steps
Enable the Excel Data Mining Add-Ins

1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Demofiles\Mod13 folder, run Setup.cmd as Administrator.

3. Start Excel and create a new blank workbook.

4. On the ribbon, click File. Then click Options.

5. In the Excel Options dialog box, on the Add-Ins page, in the Manage list, select COM Add-Ins and
click Go.
6. In the COM Add-Ins dialog box, if SQLServer.DMClientXLAddin and SQLServer.DMXLAddin are
not selected, select them both. Then click OK.

7. Close Excel.

Use the Excel Table Analysis Tools

1. Restart Excel, and open the Table Analysis.xlsx workbook in the D:\Demofiles\Mod13 folder.

2. Select any cell in the table of data, and on the ribbon, in the Table Tools group, click the Analyze
tab, and then click the icon in the Connection area.

3. In the Analysis Services Connections dialog box, click New, in the Connect to Analysis Services
dialog box, in the Server name field, type localhost, in the Catalog name drop-down list, click
DMAddinsDB, and then click OK.

4. In the Analysis Services Connections dialog box, click Close.

5. On the ribbon, click Analyze Key Influencers.

6. In the SQL Server Data Mining - Analyze Key Influencers dialog box, in the Column Selection
drop-down list, click Purchased Bike, and then click Run.

7. In the SQL Server Data Mining Discrimination based on key influencers dialog box, in the
Compare Value 1 drop-down list, click Yes, and then in the to Value 2 drop-down list, click No.

8. Click Add Report, and then click Close.

9. Review the Key Influencers Report for Purchased Bike report. Note the values that most strongly
correlate with a customer purchasing a bike.

10. Close Excel, and do not save changes to Table Analysis.xlsx.



Lesson 2
Creating a Data Mining Solution
To produce a data mining solution in Analysis Services, you first create a model that describes your
business problem. The model is then trained by running your data through algorithms that generate a
mathematical model of it. You can then either visually explore the mining model or create prediction
queries against it. Analysis Services data mining structures can use datasets from both relational and
multidimensional databases, and there is a set of algorithms you can use to investigate that data in
various ways.

This lesson describes how to create a data mining structure and data mining model. The lesson also
describes how to edit data models and use Data Mining Add-ins for Excel to create a data mining model.
Finally, the lesson introduces the Data Mining Extensions (DMX) language, which you can use to work with
data mining in Analysis Services.

Lesson Objectives
After completing this lesson, you will be able to:

Create a data mining structure and a data mining model.


Edit a data mining structure and model.

Creating Data Mining Solutions


In SQL Server Data Tools, you can use the Data
Mining Wizard to create new data mining
structures and models. You can then use the Data
Mining Designer in SQL Server Data Tools to
configure the structure. You can also use the Data
Mining Client for Excel add-in to create and edit
data mining structures and models.

Data Mining Wizard


You can use the Data Mining Wizard to define the
data mining structure and, optionally, to create
the first data mining model for the structure. The
Data Mining Wizard starts automatically when you
create a new data mining structure. When you run the Data Mining Wizard, you can configure the data
source view, the case table, the Key, Input and Predictable columns for the training data, the column data
types and content types, and the proportion of the data to use for the testing set. If you choose to create
a data mining model at the same time as a data mining structure, you must also specify the mining
technique for the model to use.

Data Mining Designer


You can use the Data Mining Designer to configure the data mining structure and data mining model that
you created with the Data Mining Wizard. For example, you can use the Data Mining Designer to add new
data mining models, train, browse, and compare models, and to create predictions based on data mining
models.

Data Mining Client for Excel


You can use the Data Mining Client for Excel add-in to create data mining structures and models. You can
also use the add-in to edit models and structures you created by using SQL Server Data Tools. The add-in

includes dedicated wizards for creating the more commonly-used data mining models, which make the
process much simpler for inexperienced users.

Demonstration: Creating a Data Mining Solution


In this demonstration, you will see how to:

Create a Data Mining Project in Visual Studio.

Create a Data Mining Structure and a Data Mining Model.

Demonstration Steps
Create a Data Mining Project in Visual Studio

1. Ensure you have completed the previous demonstration in this module.

2. Start Visual Studio and on the File menu, click New, and click Project.
3. In the New Project dialog box, click Analysis Services Multidimensional and Data Mining Project,
in the Name field, type Mine AW, in the Location field, browse to D:\Demofiles\Mod13, and then
click OK.
4. In Solution Explorer, right-click Data Sources, and then click New Data Source.

5. In the Data Source Wizard, on the Welcome to the Data Source Wizard page, click Next, and then
on the Select How to Define the Connection page, click New.

6. In the Connection Manager dialog box, in the Server name field, type MIA-SQL, in the Select or
enter a database name drop-down list, click AdventureWorksDW, and then click OK.

7. In the Data Source Wizard, on the Select How to Define the Connection page, click Next, on the
Impersonation Information page, click Use a specific Windows user name and password, and
enter the following credentials and click Next:

o User name: ADVENTUREWORKS\ServiceAcct


o Password: Pa$$w0rd

8. On the Completing the Wizard page, click Finish.

9. In Solution Explorer, right-click Data Source Views, and then click New Data Source View.

10. In the Data Source View Wizard, on the Welcome to the Data Source View Wizard page, click
Next.

11. On the Select a Data Source page, ensure that Adventure Works DW is selected, and then click
Next.

12. On the Select Tables and Views page, in the Available objects list, click ProspectiveBuyer (dbo),
hold the Ctrl key and click vTargetMail (dbo), and click the > button to move the selected objects
to the Included objects list. Then click Next.

13. On the Completing the Wizard page, in the Name field, type AW DW View, and then click Finish.

Create a Data Mining Structure and a Data Mining Model

1. In Solution Explorer, right-click Mining Structures, click New Mining Structure, and then in the
Data Mining Wizard, on the Welcome to the Data Mining Wizard page, click Next.

2. On the Select the Definition Method page, ensure that From existing relational database or data
warehouse is selected, and then click Next.

3. On the Create the Data Mining Structure page, ensure that Create mining structure with mining
model is selected, ensure that Microsoft Decision Trees is selected, and then click Next.

4. On the Select a Data Source View page, select AW DW View and click Next.
5. On the Specify Table Types page, in the vTargetMail row, select the check box in the Case column,
and then click Next.

6. On the Specify the Training Data page, in the Mining Model Structure table, select the following
columns and click Next:

o BikeBuyer: Predictable

o CustomerKey: Key

o All other columns: Input

7. On the Specify Columns Content and Data Type page, click Detect and review the content type
and data types found. Ensure that the content type of the Bike Buyer column is identified as
Discrete. Then click Next.

8. On the Create Testing Set page, note that the Percentage of data for testing value is 30 percent,
and then click Next.
9. On the Completing the Wizard page, in the Mining structure name field, type Purchase Prediction,
in the Mining model name field type Purchase Decision Tree, and then click Finish.

10. On the Build menu, click Deploy Mine AW.


11. When deployment is complete, close Visual Studio.

Editing Data Mining Structures and Models

Data Mining Designer


You can use the Data Mining Designer in Visual
Studio to edit data mining structures, add and edit
data mining models, test the accuracy of data
models, and view model data.
To add models to an existing data mining
structure, use the following procedure:

1. In Visual Studio, on the Mining Models tab, click the Create a related mining model button, or
right-click an existing mining model, and then click New Mining Model.

2. In the New Mining Model dialog box, enter a model name and select an algorithm for the new
model.

Data Mining Client for Excel


You can also use the Data Mining Client for Excel add-in to edit data mining structures and models. When
you use the Add Model To Structure Wizard in the Data Mining Client for Excel add-in to edit an existing
structure by adding a new model, you can select and configure a data mining algorithm. The Manage
Mining Structures And Models dialog box enables you to rename, delete, clear, process, export, and
import structures and models.

Demonstration: Modifying a Data Mining Structure


In this demonstration, you will see how to:

Connect to a Data Mining Model in Excel.

Add a Model to a Data Mining Structure.

Demonstration Steps
Connect to a Data Mining Model in Excel

1. Ensure you have completed the previous demonstrations in this module.


2. Start Excel and create a new blank workbook.

3. On the ribbon, on the Data Mining tab, in the Connection area (or drop-down list, depending on
the screen resolution), click the connection icon (which is labeled with the name of the last
connection used or <No Connection> if no connections have been used previously), and then in the
Analysis Services Connections dialog box, click New.

4. In the Connect to Analysis Services dialog box, in the Server name field, type MIA-SQL, in the
Catalog name drop-down list, select Mine AW, and then click OK.

5. In the Analysis Services Connections dialog box, click Close.

Add a Model to a Data Mining Structure


1. On the ribbon, in the Data Modeling area, click Advanced, and then click Add Model to Structure.

2. In the Add Model to Structure Wizard, on the Getting Started with the Add Model to Structure
Wizard page, click Next.

3. On the Select Structure or Model page, ensure that the Purchase Prediction structure is selected,
and then click Next.

4. On the Select Mining Algorithm page, in the Algorithm drop-down list, click Microsoft Naive
Bayes, and then click Next.

5. On the Select Columns page, in the Bike Buyer row, in the Usage column, click Predict Only, in the
Name Style row, in the Usage column, click Do not use, and then click Next.

6. On the Finish page, clear the Browse Model check box and click Finish.

7. On the ribbon, in the Management section, click Manage Models to verify that the model has been
added to the data mining structure, and then click Close.

8. Close Excel without saving the workbook.



Lesson 3
Validating a Data Mining Model
Validation is the process of assessing how well your mining models perform against real data. It is
important to validate your mining models by understanding their quality and characteristics before you
deploy them into a production environment.

This lesson provides an overview of data mining model validation and describes the criteria for validating
models. It also introduces the tools for model validation that are provided in SQL Server Data Tools and
the Data Mining Client for Excel add-in.

Lesson Objectives
After completing this lesson, you will be able to:

Explain the need for model validation.

Describe the main validation criteria.


Describe the tools that you can use to validate data mining models.

Overview of Data Mining Validation


Data mining algorithms use statistical models to
predict the outcome of defined scenarios.
However, if you use multiple algorithms to analyze
a single data set, the results will not be identical.
This is because each algorithm uses different
techniques to analyze data. Consequently, when
you analyze data using different algorithms, you
need to test the results to discover which
algorithm is the most accurate for the given
scenario. This process is known as validation.

When you create a data model, you use a set of data called the training set to build the model. You also
define a proportion of the data to use to test the model; this is known as the testing set. To test
the validity of a data mining model, you create prediction queries to use against the testing set, and then
use validation tools to compare the results of the mining model to this data. You can view the results of
validation tests in accuracy charts, which compare predicted values with actual values.

Other factors can affect the validity of a data model, including the quality of the data and the usefulness
of the results to the business.

Criteria for Validating Data Mining Models


To assess the validity of a data mining model, you
must consider the three criteria of accuracy,
reliability, and usefulness.

Accuracy
Data mining models correlate a defined outcome,
such as the amount of money a customer will
spend, with input criteria, such as customer age,
location, number of items purchased, and so on.
The accuracy of these correlations can be affected
by missing, approximate, or incorrect data. It is
important to assess the accuracy of a model so
you know how much you can trust the results
produced. If you are aware in advance that a certain percentage of the data contains inaccuracies, you can
decide to accept this degree of inaccuracy in the models you create. Data that qualifies as highly accurate
is not necessarily reliable or useful. For example, you might test a model that predicts future sales for a
store based on previous performance. The model might be very accurate because the correlations are
strong. However, if the method of calculating profitability used in the model is incorrect, the data
produced is neither reliable nor useful.

Reliability
To be regarded as reliable, a data mining model must perform consistently. To assess reliability, you need
to test the data model by using multiple, similar data sets. For each data set, the model should produce
similar results. For example, you could test a model that predicts future sales performance for a store by
using it against data from multiple stores. If one store calculates profitability differently, this will show up
in the testing results.

Usefulness
A data model might be both accurate and reliable, but this does not necessarily mean that the results it
produces are useful to the business. For example, your model might identify a strong correlation between
the profitability of a store and its location, but this might not be useful because it doesn't explain why that
correlation exists.

Tools for Validating Data Mining Models


Visual Studio, the Data Mining Add-In for Excel,
and SQL Server Management Studio include tools
for validating data mining models.

Visual Studio
You can use the Mining Accuracy Chart tab in
the Data Mining Designer, in the SQL Server Data
Tools for BI add-in for Visual Studio, to create
charts and reports enabling you to assess the
validity of data models.

Lift chart. This is a graphical representation of the difference between the results produced by a data
mining model and those that random guessing would generate. The difference between these two
sets of results is called the lift. You can compare multiple models simultaneously by using a lift chart,
which enables you to easily identify the best model. For example, if you are creating a mailing
campaign, you can create a lift chart to identify the model that best predicts the customers most
likely to respond. Typically, a lift chart is at its best when used with continuous data values.

Profit chart. This is a graph that displays the estimated profit or loss identified by the model for a
particular scenario. When you create a profit chart, you specify values for the number of rows to use
in the assessment, the overall cost of the scenario, the cost per row, and the anticipated revenue per
row. For example, in a mailing campaign, you could specify to use 5,000 customer rows, the total cost
of the campaign, the cost per customer, and the revenue per customer. These values are used to
calculate the campaign's profitability.

Classification matrix. This uses testing data to count and display the number of true and false
positives that a data mining model identifies for a given predicted value. For example, imagine a
Customers table that includes a BikeBuyer column containing values of 1 and 0. The value 1
indicates that the customer purchased a bike, and 0 that they did not. A classification matrix will use
the input columns to calculate whether a customer will be a bike buyer or not, and then compare
these values to the real values in the data. The matrix will display the results, showing the number of
correct and incorrect predictions for each value. Typically, a classification matrix is more accurate
when used with discrete data values.
Cross-validation report. Lift charts, profit charts, and classification matrices are all types of chart that
assess and display the accuracy of models in different ways. A cross-validation report is a different
way of validating a data mining model. To perform cross-validation, Analysis Services first divides a
data set into partitions. It then uses a model to analyze one of the partitions, treating this as the
training set. The other partitions are used to validate the results of this analysis. The process is then
repeated several times, using a different partition as the training set on each occasion.

Data Mining Client for Excel


The Data Mining Client for Excel enables you to validate data mining models using a set of wizards, which
you can access from the Accuracy and Validation area of the Data Mining tab on the ribbon. The Accuracy
Chart Wizard enables you to create a lift chart, the Classification Matrix Wizard enables you to create a
classification matrix, the Profit Chart Wizard enables you generate a profit chart, and the Cross-Validation
wizard enables you to perform cross validation.

SQL Server Management Studio


You can use SQL Server Management Studio to connect to an Analysis Services database containing a
data mining structure, and perform the following validation tasks for models:

Create a lift or profit chart.

Create a classification matrix.

Create a cross-validation report.



Demonstration: Validating Data Mining Models


In this demonstration, you will see how to:

Create a Lift Chart.

Create a Profit Chart.

Create a Classification Matrix.

Create a Cross-Validation Report.

Demonstration Steps
Create a Lift Chart

1. Ensure you have completed the previous demonstrations in this module.

2. Start SQL Server Management Studio, and when prompted, connect to the MIA-SQL instance of
Analysis Services by using Windows authentication.

3. In Object Explorer, expand Databases, expand Mine AW, and expand Mining Structures. Then
right-click the Purchase Prediction mining structure and click View Lift Chart.
4. On the Input Selection tab, note that both mining models are selected with the Bike Buyer column
as predictable, and that the test cases defined in the models themselves will be used for the
validation.
5. Click the Lift Chart tab, and view the lift chart, which compares accuracy for the two mining models
you have created against an ideal model by plotting the number of correct predictions against the
number of cases in the overall sample. As the number of cases increases, the ideal model maintains an
accuracy of 100 percent, but the models you have created tend to become less accurate the more
cases there are.

6. Review the scores in the Mining Legend pane to see which of your models is the most accurate for
this test data.

Create a Profit Chart

1. In the Chart type drop-down list, select Profit Chart.


2. In the Profit Chart Settings dialog box, enter the following values to reflect a marketing campaign
you are planning, and then click OK:

o Population: 20,000 (this is the number of potential customers you plan to contact).
o Fixed cost: 1,000 (this is the fixed cost of your marketing campaign).

o Individual cost: 3 (this is the cost associated with contacting each customer).

o Revenue per individual: 10 (this is the amount you expect a customer to spend if they respond
positively to the campaign).

3. Review the chart and the Mining Legend pane to evaluate which mining model is likely to generate
the most profitable marketing campaign based on the test data.

Create a Classification Matrix

1. Click the Classification Matrix tab.

2. Review the matrix, noting that for each model it shows the number of times the model predicted a
Bike Buyer value of 1 or 0 on rows, with columns for the actual value of the Bike Buyer column in
the test data.

Create a Cross-Validation Report



1. Click the Cross Validation tab.

2. Enter the following values, and click Get Results.

o Fold Count: 5 (this is the number of partitions used to group the data for analysis).

o Max Cases: 5 (this is the number of cases to be analyzed).

o Target Attribute: Bike Buyer (this is the predictable column to be evaluated).

o Target State: 1 (this is the desired value for the target attribute).

o Target Threshold: 0.1 (this is a value between 0 and 1 that indicates the level of accuracy
required for a prediction to be considered correct).

3. View the resulting report, and note that for each mining model, the results include the following:

o Classifications for true positives, false positives, true negatives, and false negatives.

o The likely lift gained by using the model.


4. Minimize SQL Server Management Studio as you will use it in the next demonstration.
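The cross-validation that the preceding steps configure in the user interface can also be requested
directly with a DMX stored procedure call. The following example is a sketch that assumes the
documented parameter order of the SystemGetCrossValidationResults system procedure; verify the
signature against the DMX reference before relying on it.

```sql
-- Run cross-validation for the Purchase Decision Tree model, using the
-- same values entered on the Cross Validation tab above.
CALL SystemGetCrossValidationResults(
    [Purchase Prediction],     -- mining structure
    [Purchase Decision Tree],  -- mining model(s) to validate
    5,                         -- fold count
    5,                         -- max cases
    'Bike Buyer',              -- target attribute
    '1',                       -- target state
    0.1                        -- target threshold
)
```

The result set returned by the procedure contains the same per-fold classification counts and lift
measures that the report in the user interface displays.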

Lesson 4
Consuming Data Mining Data
Results produced by data mining models need to be displayed in useful and comprehensible ways for the
benefit of data and business analysts.

This lesson describes the options for viewing data mining results and how you can use Reporting Services
reports to render them in a user-friendly format.

Lesson Objectives
After completing this lesson, you will be able to:

Describe the tools for viewing data mining results.

Explain the purpose of DMX.

Explain the benefits of using Reporting Services to display data mining results.

Viewing Data Mining Results

Mining Model Viewer


You can view the results of data mining analyses in
Visual Studio by using the Mining Model Viewer in
the Data Mining Designer. The viewing options
available vary depending on the type of model
you want to view. For example, you can use the
Microsoft Tree Viewer to browse a decision tree
model. The Microsoft Tree Viewer includes the
Decision Tree and Dependency Network tabs to
enable you to view results in different graphical
formats.

In the Mining Model Viewer, you can refresh data model views and, depending on the viewer used,
perform other tasks, such as filter items, or adjust thresholds.

Browse Models by Using the Data Mining Client for Excel Add-In
You can also browse mining model results by using the Data Mining Client for Excel add-in. You can
browse a model by clicking Browse in the Model Usage area of the ribbon and then specifying the model
you want to view. As with the Mining Model Viewer, the way results are displayed varies depending on
the type of model.

Introduction to DMX
Analysis Services includes the DMX query
language, which you can use to work with data
mining models. DMX is an extension of Transact
Structured Query Language (Transact-SQL) and
can be used to create the structure of new data
mining models, to train these models, and to
browse, manage, and use them to generate
predictions.

To help you build DMX prediction queries, SQL Server Management Studio includes a query builder tool
and DMX templates you can use as the basis for new queries. In the SQL Server Data Tools environment,
you access the query builder tool from the Mining Model Prediction tab of Data Mining Designer.

You can use the result of any DMX query as the basis of a report. By taking advantage of the
parameterization and formatting features available in Reporting Services, you can deliver data mining
results in a format that is easy to consume.

DMX is composed of Data Definition Language (DDL) statements, Data Manipulation Language (DML)
statements, and a set of functions and operators.

DDL Statements
You use DDL statements to create and delete data mining structures and models, and to import and
export data mining structures and models. DDL statements include CREATE, ALTER, DROP, IMPORT,
EXPORT, and SELECT INTO statements.

The following code example creates a new mining model that uses the Microsoft Naive Bayes algorithm.
The Bike Buyer column is defined as the predictable attribute.

Code to Create a Data Mining Model


CREATE MINING MODEL [NBSample]
(
    CustomerKey LONG KEY,
    Gender TEXT DISCRETE,
    [Number Cars Owned] LONG DISCRETE,
    [Bike Buyer] LONG DISCRETE PREDICT
)
USING Microsoft_Naive_Bayes

DML Statements
You use DML statements to train and browse mining models, and create predictions against them. DML
statements include SELECT statements, SELECT statements with PREDICTION JOIN clauses, and INSERT
INTO statements for training data mining models.
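The following code example is a sketch of training the NBSample model with an INSERT INTO statement.
The data source name ([Adventure Works DW]) and the source view and column names (dbo.vTargetMail) are
assumptions based on the demonstrations in this module; substitute the names used in your own solution.

```sql
-- Train the NBSample model by reading cases from the relational source.
-- The model columns and the columns returned by the source query are
-- matched by position, so the two lists must be in the same order.
INSERT INTO MINING MODEL [NBSample]
    (CustomerKey, Gender, [Number Cars Owned], [Bike Buyer])
OPENQUERY([Adventure Works DW],
    'SELECT CustomerKey, Gender, NumberCarsOwned, BikeBuyer
     FROM dbo.vTargetMail')
```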

The following code example returns the midpoint, minimum age, and maximum age for all the values in
the Age column.

Code to Query a Data Mining Model


SELECT DISTINCT [Age] AS [Midpoint Age],
RangeMin([Age]) AS [Minimum Age],
RangeMax([Age]) AS [Maximum Age]
FROM [TM Decision Tree]
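The following code example sketches a prediction query that uses a PREDICTION JOIN clause to apply the
trained NBSample model to new cases. The ProspectiveBuyer column names used here are assumptions; in a
real query, every input column that the model uses should be mapped in the ON clause.

```sql
-- Apply the trained NBSample model to prospective customers and return
-- the predicted probability that each one will buy a bike.
SELECT
    t.EmailAddress,
    PredictProbability([Bike Buyer]) AS [Purchase Probability]
FROM [NBSample]
PREDICTION JOIN
    OPENQUERY([Adventure Works DW],
        'SELECT EmailAddress, Gender, NumberCarsOwned
         FROM dbo.ProspectiveBuyer') AS t
ON  [NBSample].Gender = t.Gender
    AND [NBSample].[Number Cars Owned] = t.NumberCarsOwned
```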

Additional Reading: For more information, see the Data Mining Extensions (DMX)
Reference at http://go.microsoft.com/fwlink/?LinkID=246800.

Demonstration: Querying Data Mining Models


In this demonstration, you will see how to:

Browse a Data Mining Model.

Query a Data Mining Model.

Demonstration Steps
Browse a Data Mining Model

1. Ensure you have completed the previous demonstrations in this module.


2. Maximize SQL Server Management Studio, and in Object Explorer, right-click the Purchase
Prediction data mining structure and click Browse.

3. In the Mining Model drop-down list, ensure that Purchase - Bayes is selected, and on the
Dependency Network tab, move the slider gradually from All Links to Strongest Links to see which
factors are the strongest predictors that a customer will purchase a bike.

4. On the Attribute Profiles tab, view the color-coded indicators of the values for each column when
compared to customers with a Bike Buyer value of 1 or 0.

5. On the Attribute Characteristics tab, in the Attribute drop-down list, ensure that Bike Buyer is
selected, and in the Value drop-down list, select 1. Then view the probability for each other column
value when the Bike Buyer value is 1.

6. On the Attribute Discrimination tab, in the Attribute drop-down list, ensure that Bike Buyer is
selected, in the Value drop-down list, select 1, and in the Value 2 drop-down list, select 0. Then note
how values for all other columns favor a particular Bike Buyer value.

7. In the Mining Model drop-down list, select Purchase Decision Tree, and on the Decision Tree tab,
in the Background drop-down list, select 1. Then view the decision tree to see how the other column
values influence a value of 1 for Bike Buyer.

Query a Data Mining Model

1. In Object Explorer, right-click the Purchase Prediction mining structure and click Build Prediction
Query.

2. In the query designer, in the Mining Model pane, click Select Model. Then in the Select Mining
Model dialog box, expand Purchase Prediction, click Purchase - Bayes, and click OK.

3. In the Select Input Table(s) pane, click Select Case Table. Then in the Select Table dialog box, click
ProspectiveBuyer (dbo), and click OK.

4. Under the Mining Model pane, in the Source column, select ProspectiveBuyer table, and then in
the Field column, select EmailAddress.

5. Under the row you just added, in the Source column, click the Purchase - Bayes mining model, in the
Field column, click Bike Buyer, and in the Criteria/Argument column, type =1.

6. Under the row you just added, in the Source column, click Prediction Function, in the Field column,
click PredictProbability, in the Alias column, type Purchase Probability, and then drag the Bike
Buyer column from the Purchase - Bayes model in the Mining Model pane to the
Criteria/Argument column so it contains the value [Purchase - Bayes].[Bike Buyer].

7. On the Mining Model menu, click Query to view the DMX code that has been generated.

8. On the Mining Model menu, click Result to view the query results. The query returns the email
address of every prospective customer who is predicted to buy a bike, along with the probability
(expressed as a percentage in fraction format) that this forecast is accurate.

9. Close SQL Server Management Studio without saving any changes.

Using Data Mining Data in Reporting Services Reports


You have seen how DMX queries can be used to
generate predictive information from a data
mining model. To provide a way of viewing data
mining results that is easier for business users to
consume, you can create a Reporting Services
report that uses DMX to query the data mining
model and provide predictive information.
Reporting Services reports offer a wide range of
display and formatting options that enable you to
show results in the most intuitive way. You can
also incorporate the results of multiple models
into a single report for convenience.

Data mining models store data in an Analysis Services database. To connect to an Analysis Services
database to create a data mining report, you create an Analysis Services connection in a Reporting
Services project, in SQL Server Data Tools. You can then use Query Builder to create the query that
retrieves the data to use in the report.

Lab: Using Data Mining to Support a Marketing Campaign


Scenario
The marketing department at Adventure Works Cycles is planning a direct mail campaign.

In order to maximize its effectiveness, you have been asked to create a report that uses data mining
techniques to identify the subset of potential customers who are most likely to purchase a bike.

Objectives
After completing this lab, you will be able to:

Use table analysis tools in Excel.

Create a data mining structure.

Add a data mining model to a data mining structure.

Validate data mining models.

Use a data mining model as a data source for a report.

Estimated Time: 75 minutes

Virtual machine: 20466C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa$$w0rd

Exercise 1: Using Table Analysis Tools


Scenario
Marketing managers at Adventure Works want to create a mailing list that targets the individuals in their
data warehouse who are most likely to make a bike purchase. You will first use the Data Mining Add-in for
Excel to identify the key factors that correlate with previous purchases of bikes by using data from the
data warehouse.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Enable the Data Mining Add-Ins in Excel

3. Perform Table Analysis

Task 1: Prepare the Lab Environment


1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to 20466C-MIA-
SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. Run Setup.cmd in the D:\Labfiles\Lab13\Starter folder as Administrator.

Task 2: Enable the Data Mining Add-Ins in Excel


1. Start Excel and create a new blank workbook.

2. Enable the SQLServer.DMClientXLAddin and SQLServer.DMXLAddin COM add-ins.

3. Close Excel.

Task 3: Perform Table Analysis


1. Open the Customer Data For Data Mining.xlsx Excel workbook in D:\Labfiles\Lab13\Starter.

2. Create a new data mining connection to the DMAddinsDB catalog on the MIA-SQL server.

3. Use the Analyze Key Influencers tool to identify the key influencers on purchasing a bike.

o Use the Purchased Bike column as the column to analyze for key factors.

o Include a discrimination report that compares the value Yes to the value No.

4. Review the report, and then close Excel, saving your changes.

Results: After this exercise, you should have created a Key Influencers report in Excel.

Exercise 2: Creating a Data Mining Structure


Scenario
Now that you have an idea of the key influencers on bike purchasing, you want to create a data mining
model that will enable you to identify the individuals most likely to buy a bike. You will use SQL Server
Data Tools to create the model, and then deploy the model to Analysis Services.

The main tasks for this exercise are as follows:

1. Create a Data Mining Project

2. Create a Data Mining Structure and a Data Mining Model

Task 1: Create a Data Mining Project


1. Use Visual Studio to create an Analysis Services multidimensional and data mining project named AW
Data Mining. Save the project in the D:\Labfiles\Lab13\Starter folder.

2. Create a data source named Adventure Works DW that connects to the AdventureWorksDW
database on the MIA-SQL instance of the SQL Server database engine. Use the following impersonation credentials:

o User name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

3. Use the Adventure Works DW data source to create a data source view named Adventure Works
DW DM View that includes the ProspectiveBuyer table and the vTargetMail view.

Task 2: Create a Data Mining Structure and a Data Mining Model


1. Use the Data Mining Wizard to create a data mining structure that uses the Microsoft Decision
Trees algorithm.

2. Specify the vTargetMail view from the Adventure Works DW DM View data source view as the case table.

3. Configure the columns in the case table as follows:

o BikeBuyer: Predictable

o CustomerKey: Key

o All other columns: Input

4. Have the wizard detect the content and data types of the columns. Ensure that the content type of
the Bike Buyer column is identified as Discrete, and change the Yearly Income column to Discrete.

5. Create a testing set with 30 percent of the data.



6. Name the data mining structure Purchase Prediction, and name the data mining model Purchase
Decision Tree.

7. Deploy the completed model to the MIA-SQL Analysis Services instance, and then close Visual Studio.
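The structure and model the wizard builds correspond roughly to the following DMX. This is a hedged sketch, not the wizard's actual output: the wizard adds every remaining column as an input and infers the exact data and content types itself.

```sql
-- Rough DMX equivalent of what the wizard creates (sketch only; the wizard
-- adds all remaining columns as inputs and infers exact data types).
CREATE MINING STRUCTURE [Purchase Prediction] (
    [Customer Key]  LONG KEY,
    [Bike Buyer]    LONG DISCRETE,
    [Yearly Income] DOUBLE DISCRETE
    -- ...plus the other input columns detected by the wizard
) WITH HOLDOUT (30 PERCENT);

ALTER MINING STRUCTURE [Purchase Prediction]
ADD MINING MODEL [Purchase Decision Tree] (
    [Customer Key],
    [Bike Buyer] PREDICT,
    [Yearly Income]
) USING Microsoft_Decision_Trees;
```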

Results: After this exercise, you should have created a data mining structure and a data mining model.

Exercise 3: Adding a Data Mining Model to a Data Mining Structure


Scenario
You want to create a second model in the Purchase Prediction data mining structure that uses the Naive
Bayes algorithm. You will use the Data Mining Add-in for Excel to create the new model and review it.

The main tasks for this exercise are as follows:

1. Add a Model to the Data Mining Structure


2. Review the Data Mining Model

Task 1: Add a Model to the Data Mining Structure


1. In a new blank Excel workbook, create a new data mining connection to the AW Data Mining
database on the MIA-SQL Analysis Services instance.

2. Use the advanced data modeling support to add a model to the Purchase Prediction structure:

o Use the Microsoft Naive Bayes algorithm.

o Specify Bike Buyer as a Predict Only column, Customer Key as a Key column, and do not use
the Age, Birth Date, Date First Purchase, Geography Key, or Name Style columns. All
remaining columns should be specified as Input columns.

3. On the final page of the wizard, name the model Purchase Bayes, and select the options to process
and browse it.
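The model the add-in creates corresponds roughly to this DMX sketch (illustrative only; the full input column list is abbreviated):

```sql
-- Rough DMX equivalent of the model the add-in creates (sketch only).
ALTER MINING STRUCTURE [Purchase Prediction]
ADD MINING MODEL [Purchase Bayes] (
    [Customer Key],
    [Bike Buyer] PREDICT_ONLY,
    [Yearly Income]
    -- ...plus the other input columns, excluding Age, Birth Date,
    -- Date First Purchase, Geography Key, and Name Style
) USING Microsoft_Naive_Bayes;
```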

Task 2: Review the Data Mining Model


1. Review the data mining model you created in the previous task by viewing the Dependency
Network, Attribute Profiles, Attribute Characteristics, and Attribute Discrimination tabs of the
model browser.

Note: In the Bike Buyer column, the value 1 indicates that the customer is a bike buyer, and the value 0
indicates that the customer is not. When reviewing the Naive Bayes data mining model, you will need to
enter the correct values for the Bike Buyer column on the Attribute Characteristics tab, and the
Attribute Discrimination tab.

2. Close the browser but keep Excel open for the next exercise.

Results: After this exercise, you should have created a Naive Bayes data mining model.

Exercise 4: Validating Data Mining Models


Scenario
Now that you have created the data mining models, you want to validate them to see which is the most
accurate for your data. To do this, you will use the Data Mining Add-in for Excel.

The main tasks for this exercise are as follows:



1. Create an Accuracy Chart

2. Create a Classification Matrix

3. Create a Profit Chart

Task 1: Create an Accuracy Chart


1. Use the Data Mining tools in Excel to view an accuracy chart for the Purchase Prediction data
mining structure. The chart should show accuracy for predicting a Bike Buyer value of 1 from the test
data in the mining structure.

2. Review the accuracy chart to determine the relative accuracies of each model when used to predict
bike buyers in a given population of prospective customers.

Note: If the chart does not show all the data, select it and, on the Design tab of the ribbon, click Select
Data. Then in the Select Data Source dialog box, click OK.

Task 2: Create a Classification Matrix


1. Use the Data Mining tools in Excel to view a classification matrix for the Purchase Prediction data
mining structure. The matrix should show percentages of correct and incorrect predictions of a Bike
Buyer value from the test data in the mining structure.

2. Review the classification matrix to determine how often the mining models correctly predicted a
value of 1 or 0 for the Bike Buyer column when compared to the test data.

Task 3: Create a Profit Chart


1. Use the Data Mining tools in Excel to create a profit chart for the Purchase Prediction data mining
structure. The chart should show the estimated profit generated by each model when predicting a
Bike Buyer value of 1 for a target population of 5,000, with a fixed cost of 500.00, an individual cost
of 1.00, and a revenue per individual of 150.00, when using the test data in the mining structure.

2. Review the profit chart to determine predicted profitability for each model.

Note: If the chart does not show all the data, select it and, on the Design tab of the ribbon, click Select
Data. Then in the Select Data Source dialog box, click OK.

3. Save the workbook as DM Validation.xlsx in the D:\Labfiles\Lab13\Starter folder, and then close
Excel.
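Each point on the profit chart follows from these parameters by simple arithmetic. As a purely hypothetical example, if mailing the 1,000 most likely prospects (of the 5,000) reached 100 actual buyers, the estimate would be:

```sql
-- Illustrative arithmetic only (the figures 1,000 and 100 are assumptions,
-- not values from the lab).
SELECT (100 * 150.00)    -- revenue from responders
     - 500.00            -- fixed campaign cost
     - (1000 * 1.00)     -- per-individual mailing cost
     AS EstimatedProfit; -- 13500.00
```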

Results: After this exercise, you should have validated the data mining models by using the Data Mining
Add-in for Excel.

Exercise 5: Using a Data Mining Model in a Report


Scenario
Now that you have validated the data mining models, you want to create a report that contains the list of
potential bike purchasers, listed from most likely to purchase to least likely. You will create a Reporting
Services report that uses data from the AW Data Mining database and then format the report.

The main tasks for this exercise are as follows:

1. Create a report

Task 1: Create a report


1. Use Visual Studio to create a new Report Server Project Wizard project named Promotion
Targeting in the D:\Labfiles\Lab13\Starter folder.

2. Create a new data source for the AW Data Mining database in the MIA-SQL instance of Microsoft
SQL Server Analysis Services.

3. Use the query builder to create a query that uses the Purchase Bayes model and the
ProspectiveBuyer (dbo) case table. Configure the query to return the following fields:

Source               Field               Alias                 Criteria/Argument

ProspectiveBuyer     FirstName

ProspectiveBuyer     LastName

ProspectiveBuyer     Address Line 1

ProspectiveBuyer     City

Purchase Bayes       Bike Buyer                                =1

Prediction Function  PredictProbability  Purchase Probability  [Purchase - Bayes].[Bike Buyer]

4. Create a tabular report with all fields in the details section, format it using any style, and name it
Potential Bike Buyers.

5. Sort the report from Z to A by the Purchase Probability column and change the format of the
Purchase Probability column to percentage.

6. Preview the Potential Bike Buyers report, and modify its formatting until you are happy with it.

Results: After this exercise, you should have created a report that predicts bike purchasers.

Module Review and Takeaways


In this module, you have learned how to use the data mining capability of SQL Server Analysis Services for
predictive analysis.

Review Question(s)
Question: How have you seen, or can you envisage, data mining being used in organizations
where you have worked?

Course Evaluation

Your evaluation of this course will help Microsoft understand the quality of your learning experience.

Please work with your training provider to access the course evaluation form.
Microsoft will keep your answers to this survey private and confidential and will use your responses to
improve your future learning experience. Your open and honest feedback is valuable and appreciated.

Module 1: Introduction to Business Intelligence and Data Modeling

Lab: Exploring a BI Solution
Exercise 1: Exploring the Data Warehouse
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab01\Starter folder, right-click Setup.cmd and then click Run as administrator.

3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

Task 2: Explore the Data Warehouse Schema


1. Start SQL Server Management Studio and connect to the MIA-SQL instance of the database engine.
Use Windows authentication.

2. In Object Explorer, expand Databases, and then expand AdventureWorksDW.

3. Right-click Database Diagrams and click New Database Diagram.

4. In the Add Table dialog box, click DimAccount, hold the Shift key and click FactSurveyResponse,
and click Add. Then click Close.

5. Explore the diagram, noting the following details:

o The FactInternetSales table stores details of sales orders made through the Adventure Works
website. The table is related to the DimCustomers table, which contains details of the customers
who have placed orders.

o The FactResellerSales table stores details of sales orders made to resellers. The table is related to
the DimResellers table, which contains details of the resellers who have placed orders.

o The DimDate table stores values for dates, including details of the calendar and fiscal periods in
which individual dates occur.

o The FactInternetSales and FactResellerSales tables are related to the DimDate table by
multiple fields. These fields represent the order date (the date the order was placed), the due
date (the date the order was expected to be in stock), and the ship date (the date the order was
shipped to the customer).

o Both the DimCustomer and DimReseller tables are related to the DimGeography table, which
stores details of geographical locations. The DimSalesTerritory table is also related to
DimGeography.

o The DimProduct table contains details of products, and is related to the


DimProductSubcategory table, which is in turn related to the DimProductCategory table.

o Many tables include multiple language values for the same data value, for example, the
DimDate table stores the English, French, and Spanish words for each month name.

o The DimEmployee table is related to itself to represent the fact that each employee has a
manager, who is also an employee.

6. On the File menu, click Save Diagram_0, and save the diagram as AdventureWorksDW Schema.

Task 3: Query the Data Warehouse


1. Open the Query DW.sql script file in the D:\Labfiles\Lab01\Starter folder.

2. Select the Transact-SQL code under the comment Internet sales by year and month, and click
Execute.

3. View the results of the query, and note the following details:

o The data warehouse contains Internet sales orders from July 2005 to July 2008. Reseller sales in
the data warehouse are also recorded for this time period.

o You can use the MonthNumberOfYear column in the DimDate table to sort month names into
chronological order; without this field, it would be difficult (though not impossible) for reporting
clients to sort months other than alphabetically. A similar field named DayNumberOfWeek can
be used to sort week day names into chronological order.

4. Select the Transact-SQL code under the comment Geographical reseller sales, and click Execute.

5. View the results of the query, and note the following details:

o In 2005, Adventure Works only sold to resellers in the United States and Canada.

o In 2006, this was expanded to include France and the United Kingdom.

o In 2007, resellers in Australia and Germany were added.

o By contrast, Adventure Works has sold directly to Internet customers in all of these regions since
2005.
6. Select the Transact-SQL code under the comment Sales by product category, and click Execute.

7. View the results of the query, and note the following details:

o Adventure Works sells four categories of product: Accessories, Bikes, Clothing, and Components.

o Components are only sold to resellers, not to Internet customers.

o Accessories were not sold to Internet customers until 2007.
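The observations above are produced by aggregate queries over the fact and dimension tables. A minimal sketch of the shape of the first query follows (the actual Query DW.sql script may differ):

```sql
-- Sketch of an "Internet sales by year and month" query (assumed shape;
-- the lab's Query DW.sql may differ). Note the use of MonthNumberOfYear
-- to sort months chronologically rather than alphabetically.
SELECT d.CalendarYear,
       d.EnglishMonthName,
       SUM(s.SalesAmount) AS InternetSales
FROM dbo.FactInternetSales AS s
JOIN dbo.DimDate AS d ON s.OrderDateKey = d.DateKey
GROUP BY d.CalendarYear, d.EnglishMonthName, d.MonthNumberOfYear
ORDER BY d.CalendarYear, d.MonthNumberOfYear;
```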

Results: At the end of this exercise, you will have explored the data warehouse for the BI solution, and
used Transact-SQL queries to explore the data it contains.

Exercise 2: Exploring the Analysis Services Data Model


Task 1: View an Analysis Services Database
1. In SQL Server Management Studio, in Object Explorer, in the Connect drop-down list, click Analysis
Services.

2. Connect to the MIA-SQL Analysis Services server.

3. In Object Explorer, under the MIA-SQL Microsoft Analysis Services server, expand Databases. Then
expand Adventure Works OLAP.
4. Under the Adventure Works OLAP database, expand Data Sources. Note that the Analysis Services
database includes a data source named Adventure Works Data Warehouse, which connects to the
AdventureWorksDW database you explored in the previous exercise.
5. Under the Adventure Works OLAP database, expand Cubes. Then right-click Internet Sales and
click Browse.

6. In the cube browser pane, in the Metadata pane, expand Measures, and Internet Sales Order. Then
drag the Sales Amount measure onto the empty query pane. The total Internet sales amount is
displayed.

7. In the Metadata pane, expand Order Date, and then drag Order Date.Fiscal Year to the query
pane. The sales revenue is now aggregated by fiscal year. In Adventure Works, fiscal years run from
July to June, so fiscal year 2006 represents the period from July 2005 to June 2006.

8. Close SQL Server Management Studio without saving any files.

Task 2: Create a PivotTable in Excel


1. Start Microsoft Excel and create a new blank workbook.

2. On the Data tab, in the Get External Data group, in the From Other Sources list, click From
Analysis Services.

3. In the Data Connection Wizard dialog box, in the Server name box, type MIA-SQL. Then ensure
that Use Windows Authentication is selected and click Next.

4. On the Select Database and Table page, select the Adventure Works OLAP database and the
Internet Sales cube. Then click Next.
5. On the Save Data Connection File and Finish page, click Finish.

6. In the Import Data dialog box, select PivotTable Report, ensure that Existing worksheet is selected
with the cell reference =$A$1, and click OK.
7. In the PivotTable Fields pane, under Internet Sales Order, select Sales Amount. The total revenue
from Internet sales is displayed in the PivotTable on the worksheet.

8. In the PivotTable Fields pane, in the Values area, in the drop-down list for the Sales Amount field,
click Value Field Settings.

9. In the Value Field Settings dialog box, click Number Format.

10. In the Format Cells dialog box, select Accounting, and click OK. Then, in the Value Field Settings
dialog box, click OK. The Internet sales total in the PivotTable on the worksheet is formatted as a
currency value.

11. In the PivotTable Fields pane, under Order Date, select Order Date.Calendar Date. The calendar
years for the sales are displayed as columns.

12. In the PivotTable Fields pane, drag Order Date.Calendar Date from the Columns area to the Rows
area. The years are now displayed on rows.
13. In the PivotTable, expand 2006. The calendar semesters for the year are shown. Then expand
H1 CY 2006 to reveal the first two quarters, expand Q1 CY 2006 to reveal the first three months, and
expand January to reveal daily sales totals in January 2006.
14. Save the workbook as Sales.xlsx in the D:\Labfiles\Lab01\Starter folder.

Task 3: Filter the PivotTable


1. In Excel, in the Sales.xlsx workbook, click any cell in the PivotTable on the Sheet1 worksheet. Then on
the ribbon, in the Analyze tab, click Insert Slicer.

2. In the Insert Slicers dialog box, under Product, in the Product Category hierarchy, select English
Product Category Name. Then click OK.

3. Move the slicer so that you can see the PivotTable. Note that the Components category is disabled
because there are no sales of components to Internet customers.

4. In the slicer, click Accessories. The PivotTable is filtered to show only sales of accessories. Note that
there were no accessories sold in 2005 or 2006.

5. In the slicer, click the Clear Filter icon to remove the filter.

6. Save the workbook and close Excel.

Results: At the end of this exercise, you will have used Excel to explore an analytical data model built on
the data warehouse.

Exercise 3: Exploring Reports


Task 1: View Reports
1. Start Internet Explorer and browse to http://localhost/sites/adventureworks.

2. In the Adventure Works Portal site, in the Quick Launch area on the left, click Reports.

3. In the Reports folder, click Sales Trends, and wait for the report to be rendered. This report shows
historic Internet sales by product category.

4. In the parameters pane, change the Calendar Year parameter to 2007 and click Apply.

5. In the report, expand Bikes and wait for the report to update and show monthly sales for bikes.
6. Expand January to see details of the sales orders placed for bikes in January 2007.

7. At the top of the page, click Reports to return to the Reports folder.

8. In the Reports folder, click US Sales By State and wait for the report to be rendered. This report
shows annual sales revenue in the United States on a map.

9. At the top of the page, click Reports to return to the Reports folder.

Task 2: Export a Report


1. In the Reports document library, click Sales Report and wait for the report to be rendered.
2. Click the border on the left side of the Parameters pane to hide it, and note that the report includes
a chart that shows monthly sales for each product category.

3. At the top of the page, click the Next Page button and wait for the page to be rendered. Note that
this page shows sales for January and includes expandable groupings you can use to drill down to see
sales details for individual products.

4. In the Actions menu, point to Export, and click Excel. When prompted, save the report in the
D:\Labfiles\Lab01\Starter folder and then open it.

5. In Excel, note that the workbook includes a summary worksheet containing a chart and a worksheet
for each month.

6. On the January worksheet, note that you can use native Excel functionality to expand and collapse
the data groupings.

7. Close Excel without saving any changes to the workbook and return to the report in Internet
Explorer.

8. In the Actions menu, point to Export, and click Word. When prompted, save the report in the
D:\Labfiles\Lab01\Starter folder and then open it.

9. In Word, note that the first page of the report shows the chart and that the sales details for each
month start on a new page. Note also that this report has been designed to identify when it is being
rendered in a non-interactive format and automatically expand the data groupings.

10. Close Word without saving any changes and close Internet Explorer.

Results: At the end of this exercise, you will have viewed data in reports and exported a report to Excel
and Word formats.

Module 2: Creating Multidimensional Databases


Lab: Creating a Multidimensional Database
Exercise 1: Creating a Data Source
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab02\Starter folder, right-click Setup.cmd, and then click Run as administrator.

3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

Task 2: Create an Analysis Services Project


1. Start Visual Studio, and on the File menu, point to New, and then click Project.

2. In the Templates list, select Business Intelligence and then click Analysis Services
Multidimensional and Data Mining Project.
3. Change the value in the Name box to Adventure Works OLAP.

4. Click Browse, browse to the D:\Labfiles\Lab02\Starter folder, and then click Select Folder.

5. Click OK.

Task 3: Create a Data Source


1. In Solution Explorer, right-click the Data Sources folder, and then click New Data Source.

2. On the Welcome to the Data Source Wizard page, click Next.

3. On the Select how to define the connection page, click New.

4. In the Connection Manager dialog box, in the Server name box, type localhost.

5. In the Log on to the server area, ensure that Use Windows Authentication is selected.

6. In the Connect to a database area, in the Select or enter a database name box, click
AdventureWorksDW, and then click OK.

7. On the Select how to define the connection page, click Next. On the Impersonation Information
page, select Use a specific Windows user name and password.
8. In the User name box, type ADVENTUREWORKS\ServiceAcct.

9. In the Password box, type Pa$$w0rd, and then click Next.

10. On the Completing the Wizard page, change the data source name to Adventure Works Data
Warehouse, and then click Finish.

Results: After this exercise, you should see the Adventure Works Data Warehouse.ds data source in the
Data Sources folder.

Exercise 2: Creating and Modifying a Data Source View


Task 1: Create a Data Source View
1. In Solution Explorer, right-click the Data Source Views folder, and then click New Data Source
View.

2. On the Welcome to the Data Source View Wizard page, click Next.

3. On the Select a Data Source page, verify that the Adventure Works Data Warehouse data source
is selected, and then click Next.

4. In the Available objects list, click DimCustomer (dbo), and then hold down the Ctrl key and click
DimDate (dbo), DimGeography (dbo), DimProduct (dbo), DimProductCategory (dbo),
DimProductSubcategory (dbo), and FactInternetSales (dbo). Then click the right arrow (>) button
to add the selected tables to the Included objects list, and click Next.

5. On the Completing the Wizard page, change the name of the data source view to Adventure
Works DSV, and then click Finish. The Data Source View Designer opens automatically.

Task 2: Modify a Data Source View


1. In the Data Source View Designer, use the Zoom icon to change the zoom level to 50%.
2. In the diagram, click the title bar of the FactInternetSales table and press F4.

3. In the Properties pane, change the FriendlyName property to Internet Sales.

4. Click the DimCustomer table, and change its FriendlyName property to Customer.
5. Change the FriendlyName property of the remaining tables to remove the Dim prefix and add
spaces between words.

6. Right-click the Customer table, and then click New Named Calculation.

7. In the Create Named Calculation dialog box, in the Column name box, type Full Name.

8. In the Expression box, type the text in the following code example:

CASE
WHEN MiddleName IS NULL THEN
FirstName + ' ' + LastName
ELSE
FirstName + ' ' + MiddleName + ' ' + LastName
END

9. In the Create Named Calculation dialog box, click OK.

10. On the File menu, click Save All.

Results: After this exercise, you have created a data source view named Adventure Works DSV.dsv.

Exercise 3: Creating and Modifying a Cube


Task 1: Create a Cube
1. In Solution Explorer, right-click the Cubes folder, and then click New Cube.

2. On the Welcome to the Cube Wizard page, click Next.

3. On the Select Creation Method page, verify that Use existing tables is selected, and then click
Next.

4. On the Select Measure Group Tables page, click Suggest.

5. Note that the wizard selects the Internet Sales table as the measure group table, and then click
Next.

6. On the Select Measures page, clear every check box except Order Quantity, Total Product Cost,
Sales Amount, and Internet Sales Count, and click Next.

7. On the Select New Dimensions page, clear the Internet Sales check box, and then click Next.

8. On the Completing the Wizard page, change the Cube name to Sales, and then click Finish. The
Cube Designer opens automatically.

Task 2: Edit Measures


1. In the Measures pane of the cube designer, expand the Internet Sales measure group, right-click
the Order Quantity measure, and then click Rename.

2. Rename Order Quantity to Internet Order Quantity.


3. In the Measures pane, right-click the Total Product Cost measure, and then click Rename.

4. Rename Total Product Cost to Internet Cost.

5. In the Measures pane, right-click Sales Amount, and then click Rename.

6. Rename Sales Amount to Internet Revenue.

7. On the File menu, click Save All.

Task 3: Edit Dimensions


1. In Solution Explorer, under Dimensions, right-click Customer.dim, and then click View Designer.
2. In the Data Source View pane, in the Geography table, click City.

3. Hold down the Ctrl key, click StateProvinceName, EnglishCountryRegionName, and PostalCode,
and then drag the selected columns to the Attributes pane.

4. In the Data Source View pane, in the Customer table, click CustomerAlternateKey.

5. Hold down the Ctrl key, click Title, FirstName, MiddleName, LastName, and Full Name.

6. Drag the selected columns to the Attributes pane.

7. On the File menu, click Save All. Then close the Customer.dim dimension designer.

8. In Solution Explorer, under Dimensions, right-click Product.dim, and then click View Designer. In
the Data Source View pane, in the Product table, click ProductAlternateKey. Then hold down the
Ctrl key, click EnglishProductName and ListPrice, and drag the selected columns to the Attributes
pane.

9. In the Data Source View pane, in the Product Subcategory table, click
EnglishProductSubcategoryName, and drag the selected column to the Attributes pane.

10. In the Data Source View pane, in the Product Category table, click EnglishProductCategoryName,
and drag the selected column to the Attributes pane.

11. On the File menu, click Save All. Then close the Product.dim dimension designer.

12. In Solution Explorer, under Dimensions, right-click Date.dim, and then click View Designer.

13. In the Data Source View pane, in the Date table, click FullDateAlternateKey. Then hold down the
Ctrl key, click EnglishMonthName, MonthNumberOfYear, CalendarQuarter, CalendarYear, and
CalendarSemester and drag the selected columns to the Attributes pane.

14. In the Attributes pane, right-click English Month Name and click Rename. Then change the
attribute name to Month.

15. On the File menu, click Save All. Keep the Date.dim designer open for the next task.

Task 4: Browse the Cube


1. In Solution Explorer, right-click the Adventure Works OLAP solution, and then click Deploy. If an
Account Password dialog box appears, in the Password field type Pa$$w0rd and click OK. Then
wait for the Deploy Succeeded message in the status bar.

2. When deployment has completed successfully, in the Sales.cube cube designer, click the Browser
tab.

Tip: Click the Auto Hide icon on the various Visual Studio panes to make it easier to see the entire cube
browser window.

3. In the Measure Group pane, expand Measures, expand Internet Sales, and then drag the Internet
Revenue measure to the Drag levels or measures here to add to the query area.

4. In the Measure Group pane, drag Internet Sales Count to the right of the Internet Revenue
column.

5. In the Measure Group pane, expand the Order Date dimension, and drag the Order Date.Calendar
Year attribute to the left of the Internet Revenue column. The cube browser shows sales amounts
and counts for multiple years.
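Note: The view assembled in steps 3–5 is roughly equivalent to the following MDX, which you could run against the deployed database (for example, in a SQL Server Management Studio MDX query window). The cube name Sales and the measure and attribute names are those created earlier in this lab; the exact query the cube browser generates may differ.

```mdx
SELECT
  { [Measures].[Internet Revenue],
    [Measures].[Internet Sales Count] } ON COLUMNS,
  [Order Date].[Calendar Year].[Calendar Year].Members ON ROWS
FROM [Sales]
```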

6. In Visual Studio, on the File menu, click Save All.

7. Keep Visual Studio open for the next exercise.

Results: After this exercise, you should have successfully created and deployed a cube named Sales.cube.

Exercise 4: Adding a Dimension


Task 1: Add a Table to a Data Source View
1. In Visual Studio, in the Adventure Works OLAP project, in Solution Explorer, under Data Source
Views, double-click Adventure Works DSV.dsv to open it.

2. In the Data Source View menu, click Add/Remove Tables.

3. In the Add/Remove Tables dialog box, in the Available objects list, select DimSalesTerritory and
click the right arrow (>) button. Then click OK.

4. In the diagram, click the title bar of the DimSalesTerritory table and press F4.
5. In the Properties pane, change the FriendlyName property to Sales Territory.

6. On the File menu, click Save All.



Task 2: Create a Dimension


1. In Solution Explorer, right-click the Dimensions folder and click New Dimension.

2. In the Dimension Wizard, on the Welcome to the Dimension Wizard page, click Next.

3. On the Select Creation Method page, select Use an existing table, and click Next.

4. On the Specify Source Information page, ensure that the Adventure Works DSV data source view
is selected. Then, in the Main table list, select Sales Territory. Verify that SalesTerritoryKey is
selected as the key column and the name column, and click Next.

5. On the Select Dimension Attributes page, select SalesTerritoryRegion, SalesTerritoryCountry,
and SalesTerritoryGroup, and ensure that the Enable Browsing checkbox is selected for each of
them. Then click Next.

6. On the Completing the Wizard page, click Finish. The new dimension is opened in the dimension
designer.

Task 3: Add a Dimension to a Cube


1. In Solution Explorer, under Cubes, double-click Sales.cube to open it in the cube designer.

2. On the Cube menu, click Add Cube Dimension.

3. In the Add Cube Dimension dialog box, select Sales Territory and click OK.

4. On the Dimension Usage tab, verify that the Sales Territory dimension is related to the Internet
Sales measure group using the Sales Territory Key attribute.

5. On the File menu, click Save All.

Task 4: Analyze a Cube in Excel


1. In Solution Explorer, right-click the Adventure Works OLAP solution, and then click Deploy. If an
Account Password dialog box appears, in the Password field type Pa$$w0rd and click OK. Then
wait for the Deploy Succeeded message in the status bar.
2. When deployment has completed successfully, in the Sales.cube cube designer, on the Browser tab,
click Reconnect.

3. On the Cube menu, click Analyze in Excel. If you are prompted to enable data connections, click
Enable.

4. In the PivotTable Fields pane, beneath the Internet Sales measure group, select Internet Revenue.

5. In the PivotTable Fields pane, under the Sales Territory dimension select Sales Territory Group.
Sales for each territory group are shown.

6. In the PivotTable Fields pane, under the Sales Territory dimension select Sales Territory Country.
Sales within each sales territory group are broken down for each country.

7. In the PivotTable Fields pane, under the Sales Territory dimension select Sales Territory Region.
Sales within each sales territory country are broken down for each region. Note that some countries
do not include sales regions, so the country is shown at both the country and region levels.
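Note: The nested PivotTable view produced in steps 5–7 can be approximated with an MDX crossjoin over the three Sales Territory attributes. This is a sketch using the names defined in this exercise and the cube name Sales; the query Excel actually generates will differ.

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  NON EMPTY
    [Sales Territory].[Sales Territory Group].[Sales Territory Group].Members *
    [Sales Territory].[Sales Territory Country].[Sales Territory Country].Members *
    [Sales Territory].[Sales Territory Region].[Sales Territory Region].Members
  ON ROWS
FROM [Sales]
```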

8. Close Excel without saving the workbook.

9. In Visual Studio, on the File menu, click Save All. Then close Visual Studio.

Results: At the end of this exercise, your database and cube should contain a Sales Territory dimension.

Module 3: Working with Cubes and Dimensions


Lab: Defining Dimensions
Exercise 1: Configuring Dimensions and Attributes
Task 1: Prepare the Lab Environment
1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab03\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Remove Unused Attributes


1. Start Visual Studio and open Adventure Works OLAP.sln in the D:\Labfiles\Lab03\Starter folder.

2. In Solution Explorer, expand Dimensions and double-click the Customer.dim dimension. Notice that
many attributes have been added to allow business users to aggregate measures in many different
ways. However, users have complained that some of these attributes are unnecessary and that they
should be removed to make browsing the cube simpler.

3. In the Attributes pane, click Commute Distance, press the Ctrl key and click Number Cars Owned
and Number Children At Home, right-click any of the highlighted attributes, and then click Delete.
Then in the Delete Objects dialog box, click OK.

4. On the File menu, click Save All. Then close the Customer.dim dimension designer.
5. In Solution Explorer, double-click the Product.dim dimension. Again, users have requested that you
remove some unnecessary attributes from this dimension.

6. In the Attributes pane, click Days To Manufacture, press the Ctrl key and click Safety Stock Level.
Then press Delete and, in the Delete Objects dialog box, click OK.

7. On the File menu, click Save All. Then close the Product.dim dimension designer.

Task 3: Add Dimension Intelligence


1. In Solution Explorer, right-click Date.dim, and then click Add Business Intelligence.

2. On the Welcome to the Business Intelligence Wizard page, click Next.

3. On the Choose Enhancement page, click Define dimension intelligence, and then click Next.

4. On the Define Dimension Intelligence page, in the Dimension type field, click Time.

5. In the Dimension attributes table, select the Include check box for the following attribute types,
and select the corresponding item in the Dimension Attribute column:

o Year: Calendar Year

o Half Year: Calendar Semester

o Quarter: Calendar Quarter

o Month: Month

o Date: Full Date Alternate Key

6. Click Next, and then click Finish.



Task 4: Group Attribute Members


1. In Solution Explorer, double-click the Customer.dim dimension to open it in the dimension designer.

2. In the Data Source View pane, right-click the Customer table, and then click Explore Data. Notice
the range of values for the YearlyIncome column.

3. Close the Explore Customer Table window, and then on the Dimension menu, click Process. If you
are prompted to build and deploy the project first, click Yes, and, if you are prompted to enter a
password, type Pa$$w0rd and click OK.

4. In the Process Dimension-Customer dialog box, click Run and wait for the dimension to be
processed. Then click Close, and click Close again to close the Process Dimension-Customer dialog
box.

5. On the Browser tab, click Reconnect. Then change the Hierarchy field to Yearly Income.

6. Expand All, and notice that the data is unstructured.


7. On the Dimension Structure tab, in the Attributes pane, right-click Yearly Income and click
Properties.

8. In the Properties pane, in the DiscretizationMethod box, click Automatic, in the


DiscretizationBucketCount box type 5 and, in the OrderBy list, select Key.

9. Repeat the steps you performed earlier to process and deploy the dimension. Then, on the Browser
tab, click Reconnect and verify that the yearly income is grouped into five ranges, with a sixth
member for unknown values.
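Note: After processing, the discretized Yearly Income attribute exposes five generated range members plus a member for unknown values. Assuming the cube is named Sales, a quick way to inspect the generated members is an MDX query such as the following; the range member names are derived from the data, so they will vary.

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  [Customer].[Yearly Income].Members ON ROWS
FROM [Sales]
```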

Results: After this exercise, the Customer and Product dimensions have had some attributes removed,
time intelligence has been added to the Date dimension, and the Yearly Income attribute members in the
Customer dimension have been grouped into discretized ranges.

Exercise 2: Creating Hierarchies


Task 1: Create a Natural Hierarchy
1. In Solution Explorer, double-click the Product.dim dimension to open it in the dimension designer.
Note that this dimension includes attributes from three related tables (Product, Product
Subcategory, and Product Category).

2. On the Dimension Structure tab, in the Attributes pane, drag English Product Category Name
into an empty area of the Hierarchies pane. This creates a new hierarchy named Hierarchy.

3. In the Attributes pane, drag English Product Subcategory Name to the <new level> area beneath
English Product Category Name in the hierarchy.

4. In the Attributes pane, drag English Product Name to the <new level> area beneath English
Product Subcategory Name in the hierarchy.

Note: If a warning is displayed, notifying you that attribute relationships do not exist and performance
may be decreased, ignore it. You will see how to use attribute relationships to optimize a hierarchy later in
this lab.

5. Right-click Hierarchy and click Rename. Then rename the hierarchy to Categorized Products.

6. In the Categorized Products hierarchy, right-click English Product Category Name and click
Rename. Then rename the hierarchy level to Category.

7. In the Categorized Products hierarchy, right-click English Product Subcategory Name and click
Rename. Then rename the hierarchy level to Subcategory.

8. In the Categorized Products hierarchy, right-click English Product Name and click Rename. Then
rename the hierarchy level to Product.

9. In the Attributes pane, select all the attributes by clicking the first one, and then holding Shift and
clicking the last one. Then press F4 and, in the Properties pane, set the AttributeHierarchyVisible
property to False. This hides the individual attributes, making the Categorized Products hierarchy
the only way to browse the Product dimension.

10. On the Dimension menu, click Process. If you are prompted to build and deploy the project first,
click Yes. If you are prompted to enter a password, type Pa$$w0rd and click OK.

11. In the Process Dimension-Product dialog box, click Run and wait for the dimension to be processed.
Then click Close, and click Close again to close the Process Dimension-Product dialog box.

12. On the Browser tab, ensure that the Categorized Products hierarchy is selected and expand the All
level to display the categories.

13. Expand the categories and subcategories to view individual products.
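Note: The drill-down you just performed in the browser corresponds to an MDX query over the new hierarchy. The following sketch assumes the cube is named Sales and that the Product dimension is exposed in the cube under that name:

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  DESCENDANTS(
    [Product].[Categorized Products].[All],
    [Product].[Categorized Products].[Product],
    SELF_AND_BEFORE ) ON ROWS
FROM [Sales]
```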

14. On the File menu, click Save All.

Task 2: Create a Non-Natural Hierarchy


1. In Solution Explorer, double-click the Customer.dim dimension to open it.
2. On the Dimension Structure tab, in the Attributes pane, drag Gender into an empty area of the
Hierarchies pane. This creates a new hierarchy named Hierarchy.

3. Right-click Hierarchy and click Rename. Then rename the hierarchy to Gender-Marital Status.
4. In the Attributes pane, drag Marital Status to the <new level> area beneath Gender in the
Gender-Marital Status hierarchy.

Note: If a warning is displayed, saying that attribute relationships do not exist and performance may be
decreased, ignore it. You will see how to use attribute relationships to optimize a hierarchy later in this
lab.

5. On the Dimension menu, click Process. If you are prompted to build and deploy the project first,
click Yes, and if you are prompted to enter a password, type Pa$$w0rd and click OK.

6. In the Process Dimension-Customer dialog box, click Run and wait for the dimension to be processed.
Then click Close, and click Close again to close the Process Dimension-Customer dialog box.
7. On the Browser tab, ensure that the Gender-Marital Status hierarchy is selected and expand the All
level to display the gender level. Note that the gender can be F, M, or Unknown.

8. Expand F and M, and note that the marital status level value can be M or S.
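Note: Because Gender-Marital Status is a non-natural hierarchy, expanding it shows every combination of the two attributes. The equivalent MDX, assuming the cube is named Sales, is along these lines:

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  [Customer].[Gender-Marital Status].[Marital Status].Members ON ROWS
FROM [Sales]
```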

9. On the File menu, click Save All.

Results: After this exercise, you should have created a Categorized Products hierarchy and a Gender-
Marital Status hierarchy.

Exercise 3: Create a Hierarchy with Attribute Relationships


Task 1: Configure Attribute Column Bindings
1. In Solution Explorer, double-click the Date.dim dimension to open it in the dimension designer.

2. In the Properties pane, note that the Type property for this dimension has been set to Time. This
occurred when you added dimension intelligence in the previous exercise.

3. In the Attributes pane, right-click Calendar Semester, and click Properties.

4. In the Properties pane, scroll down to the Source section, click the KeyColumns field, and then click
the ellipsis (...) button.
5. In the Key Columns dialog box, in the Available Columns table, click Calendar Year, and then click
the right arrow (>) icon.

6. Click the up arrow icon to move CalendarYear above CalendarSemester, and then click OK.

7. In the Properties pane, click the NameColumn field, and then click the ellipsis (...) button.

8. In the Name Column dialog box, in the Source column field, click CalendarSemester, and then click
OK.
9. In the Properties pane, click the ValueColumn field, and then click the ellipsis (...) button.

10. In the Value Column dialog box, in the Source column field, click CalendarSemester, and then click
OK.
11. In the Attributes pane, click Calendar Quarter.

12. In the Properties pane, scroll down to the Source section, click the KeyColumns field, and then click
the ellipsis (...) button.
13. In the Key Columns dialog box, in the Available Columns table, click Calendar Year, and then click
the right arrow (>) icon.

14. Click the up arrow icon to move CalendarYear above CalendarQuarter, and then click OK.
15. In the Properties pane, click the NameColumn field, and then click the ellipsis (...) button.

16. In the Name Column dialog box, in the Source column field, click CalendarQuarter, and then click
OK.

17. In the Properties pane, click the ValueColumn field, and then click the ellipsis (...) button.

18. In the Value Column dialog box, in the Source column field, click CalendarQuarter, and then click
OK.

19. In the Attributes pane, click Month, and in the Properties pane, scroll down to the Source section,
click the KeyColumns field, and then click the ellipsis (...) button.

20. In the Key Columns dialog box, in the Key Columns table, click EnglishMonthName, and then click
the left arrow (<) icon.

21. In the Available Columns table, click Calendar Year, and then click the right arrow (>) icon.

22. In the Available Columns table, click MonthNumberOfYear, click the right arrow (>) icon. Ensure
that Calendar Year appears above MonthNumberOfYear, and then click OK.

23. In the Properties pane, click the NameColumn field, and then click the ellipsis (...) button.

24. In the Name Column dialog box, in the Source column field, click EnglishMonthName, and then
click OK.

25. In the Properties pane, click the ValueColumn field, and then click the ellipsis (...) button.

26. In the Value Column dialog box, in the Source column field, click EnglishMonthName, and then
click OK.

27. On the File menu, click Save All.

Task 2: Create Attribute Relationships


1. In the Date.dim dimension designer, click the Attribute Relationships tab.

2. In the diagram pane, right-click an empty space, and then click New Attribute Relationship.

3. In the Create Attribute Relationship dialog box, in the Source Attribute section, in the Name field,
click Full Date Alternate Key. Then, in the Related Attribute section, in the Name field, click
Month.

4. In the Relationship type field click Rigid (will not change over time), because a particular date will
always be in the same month, and then click OK.

5. Repeat the previous three steps to create the following relationships:


o Month > Calendar Quarter (Rigid)

o Calendar Quarter > Calendar Semester (Rigid)

o Calendar Semester > Calendar Year (Rigid)

6. On the File menu, click Save All.

Task 3: Create a Hierarchy


1. In the Date.dim dimension designer, on the Dimension Structure tab, in the Attributes pane, drag
Calendar Year into the Hierarchies pane. A new hierarchy named Hierarchy is created.

2. Right-click Hierarchy and click Rename then rename the hierarchy to Calendar Date.

3. In the Attributes pane, drag the following attributes one-by-one to the Calendar Date hierarchy and
drop them on to the <new level> area:
o Calendar Semester

o Calendar Quarter

o Month
o Full Date Alternate Key

4. In the Calendar Date hierarchy, right-click Full Date Alternate Key and click Rename. Then rename
the hierarchy level to Day.

5. In the Attributes pane, select Calendar Year, press the Ctrl key and click Calendar Semester,
Calendar Quarter, Month, and Full Date Alternate Key to select all of these attributes.

6. In the Properties pane, change the AttributeHierarchyVisible property to False. This defines these
attributes as member properties rather than hierarchies in their own right, so that users can only
browse them through the Calendar Date hierarchy.

7. On the Dimension menu, click Process. If you are prompted to build and deploy the project first,
click Yes, and if you are prompted to enter a password, type Pa$$w0rd and click OK.

8. In the Process Dimension-Date dialog box, click Run and wait for the dimension to be processed.
Then click Close, and click Close again to close the Process Dimension-Date dialog box.

9. On the Browser tab, ensure that the Calendar Date hierarchy is selected and expand the All level to
display calendar years. Then expand a year to reveal semesters, expand a semester to reveal months,
and expand a month to reveal individual dates.

10. Note that months are displayed in alphabetical order instead of chronological order.

11. On the Dimension Structure tab, in the Attributes pane, right-click the Month attribute and click
Properties.

12. In the Properties pane, in the OrderBy box, click Key.

13. Repeat the steps you performed earlier to process and deploy the dimension. Then on the Browser
tab, click Reconnect and verify that the months in the Calendar Date hierarchy are shown in
chronological order.
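Note: You can confirm the new ordering outside the browser by listing the Month level members with MDX. With OrderBy set to Key, the composite key (CalendarYear, MonthNumberOfYear) drives the order, so members should come back chronologically. The cube name Sales is assumed; when querying the cube, the Date dimension appears under its role-playing name, for example Order Date.

```mdx
SELECT {} ON COLUMNS,
  [Order Date].[Calendar Date].[Month].Members ON ROWS
FROM [Sales]
```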

Results: At the end of this exercise, you will have a hierarchy named Calendar Date.

Exercise 4: Creating a Ragged Hierarchy


Task 1: Configure Attribute Column Bindings
1. In Solution Explorer, double-click Sales Territory.dim to open it in the dimension designer.
2. In the Attributes pane, right-click Sales Territory Country, and click Properties.

3. In the Properties pane, scroll down to the Source section, click the KeyColumns field, and then click
the ellipsis (...) button.
4. In the Key Columns dialog box, in the Available Columns table, click SalesTerritoryGroup, and
then click the right arrow (>) icon.

5. Click the up arrow icon to move SalesTerritoryGroup above SalesTerritoryCountry, and then click
OK.

6. In the Properties pane, click the NameColumn field, and then click the ellipsis (...) button.

7. In the Name Column dialog box, in the Source column field, click SalesTerritoryCountry, and then
click OK.

8. In the Properties pane, click the ValueColumn field, and then click the ellipsis (...) button.

9. In the Value Column dialog box, in the Source column field, click SalesTerritoryCountry, and then
click OK.

10. In the Attributes pane, right-click Sales Territory Region, and click Properties.

11. In the Properties pane, scroll down to the Source section, click the KeyColumns field, and then click
the ellipsis (...) button.

12. In the Key Columns dialog box, in the Available Columns table, click SalesTerritoryGroup and click
the right arrow (>) icon. Then click SalesTerritoryCountry and click the right arrow (>) button.

13. In the Key Columns list, select SalesTerritoryRegion and use the down arrow icon to move it to the
bottom of the list, so the list is ordered SalesTerritoryGroup, SalesTerritoryCountry, and
SalesTerritoryRegion. Then click OK.

14. In the Properties pane, click the NameColumn field, and then click the ellipsis (...) button.

15. In the Name Column dialog box, in the Source column field, click SalesTerritoryRegion, and then
click OK.

16. In the Properties pane, click the ValueColumn field, and then click the ellipsis (...) button.

17. In the Value Column dialog box, in the Source column field, click SalesTerritoryRegion, and then
click OK.

18. On the File menu, click Save All.

Task 2: Create Attribute Relationships


1. In the Sales Territory.dim dimension designer, on the Attribute Relationships tab, in the diagram
pane, right-click an empty space, and then click New Attribute Relationship.

2. In the Create Attribute Relationship dialog box, in the Source Attribute section, in the Name field,
click Sales Territory Region. Then, in the Related Attribute section, in the Name field, click Sales
Territory Country.

3. In the Relationship type field click Rigid (will not change over time) and then click OK.

4. In the diagram pane, right-click an empty space, and then click New Attribute Relationship.

5. In the Create Attribute Relationship dialog box, in the Source Attribute section, in the Name field,
click Sales Territory Country. Then, in the Related Attribute section, in the Name field, click Sales
Territory Group.
6. In the Relationship type field click Flexible (May change over time) and then click OK.

7. On the File menu, click Save All.

Task 3: Create a Hierarchy


1. In the Sales Territory.dim dimension designer, on the Dimension Structure tab, in the Attributes
pane, drag Sales Territory Group into the Hierarchies pane. A new hierarchy named Hierarchy is
created.

2. Right-click Hierarchy and click Rename. Then rename the hierarchy to Sales Territory.

3. In the Attributes pane, drag the following attributes one-by-one to the Sales Territory hierarchy
and drop them on to the <new level> area:

o Sales Territory Country


o Sales Territory Region

4. In the Attributes pane, select Sales Territory Country, press the Shift key and click Sales Territory
Region to select all the attributes in the dimension.
5. In the Properties pane, change the AttributeHierarchyVisible property to False.

6. On the Dimension menu, click Process. If you are prompted to build and deploy the project first,
click Yes, and if you are prompted to enter a password, type Pa$$w0rd and click OK.
7. In the Process Dimension-Sales Territory dialog box, click Run and wait for the dimension to be
processed. Then click Close, and click Close again to close the Process Dimension-Sales Territory dialog
box.

8. On the Browser tab, ensure that the Sales Territory hierarchy is selected and expand the All level to
display sales territory groups. Then expand each group to reveal countries, and expand each country
to reveal regions.

9. Note that territories with no region level display the country name instead. Also note that the
hierarchy includes an unknown member for rows where the sales territory is unknown, even though the
NA value in the database is already used for this purpose.

10. On the Dimension Structure tab, in the Attributes pane, right-click the Sales Territory dimension
at the root of the attributes tree and click Properties.

11. In the Properties pane, in the UnknownMember box, select None.



12. In the Attributes pane, select the Sales Territory Key attribute, and then in the Properties pane,
expand the KeyColumns property, expand Sales Territory.Sales Territory Key, and change the
NullProcessing property to Automatic.

13. In the Sales Territory hierarchy, select Sales Territory Region. Then in the Properties pane, set the
HideMemberIf property to OnlyChildWithParentName.

14. Repeat the steps you performed earlier to process and deploy the dimension. Then on the Browser
tab, click Reconnect and verify that the unknown member has been removed, and sales territories
without regions can't be expanded beyond the country level.
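Note: The ragged hierarchy can also be inspected with MDX (cube name Sales assumed). Whether the hidden placeholder members are suppressed in the result depends on the client and its MDX Compatibility connection setting, so a raw MDX result may still show them even after you set HideMemberIf.

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  NON EMPTY DESCENDANTS(
    [Sales Territory].[Sales Territory].[All],
    [Sales Territory].[Sales Territory].[Sales Territory Region],
    SELF_AND_BEFORE ) ON ROWS
FROM [Sales]
```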

15. On the File menu, click Save All.

Exercise 5: Browse Dimensions and Hierarchies in a Cube


Task 1: Process the Cube
1. In Solution Explorer, under Cubes, double-click Sales.cube to open it in the cube designer.

2. On the Cube menu, click Process. If you are prompted to build and deploy the project first, click Yes,
and if you are prompted to enter a password, type Pa$$w0rd and click OK.

3. In the Process Cube-Sales dialog box, click Run.

4. In the Process Progress dialog box, when processing is complete, click Close.

5. In the Process Cube-Sales dialog box, click Close.

Task 2: Browse the Cube in Visual Studio


1. In the Sales.cube designer, click the Browser tab.

2. In the Metadata pane, expand Measures, expand Internet Sales, and drag Internet Revenue to the
query results area.

3. In the Metadata pane, expand Customer, and drag Gender-Marital Status to the query results,
placing it to the left of the Internet Revenue column. Note that Internet revenue is broken down by
every combination of gender and marital status.

4. In the Metadata pane, drag Yearly Income to the Hierarchy cell in the filter area above the query
results. Then, in the Filter Expression drop-down list, select the highest customer income band and
click OK. Note that Internet revenue values for gender and marital status combinations are filtered to
include only sales to customers in the highest income band.
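Note: In MDX, the filter area corresponds to the WHERE clause. The following sketch assumes the cube is named Sales; the unique name of the income-band member is generated at processing time, so replace the placeholder key shown here with the actual member copied from the browser's filter list.

```mdx
SELECT [Measures].[Internet Revenue] ON COLUMNS,
  [Customer].[Gender-Marital Status].[Marital Status].Members ON ROWS
FROM [Sales]
-- placeholder key: substitute the unique name of the highest income band
WHERE ( [Customer].[Yearly Income].&[5] )
```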

Task 3: Browse the Cube in Excel


1. With the Sales cube open in the cube browser in Visual Studio, on the Cube menu, click Analyze in
Excel. If a security notice is displayed, click Enable.

2. In the PivotTable Fields pane, under Internet Sales, click Internet Revenue. The total revenue is
displayed in the PivotTable.

3. In the PivotTable Fields pane, under Order Date, click Order Date.Calendar Date. The revenue in
the PivotTable is broken down by calendar year on columns.

4. In the PivotTable, expand the second year to reveal the semesters. Then expand the semesters to
reveal quarters, expand the quarters to reveal months, and expand months to reveal individual dates.

5. In the PivotTable Fields pane, under Sales Territory, click Sales Territory. The revenue in the
PivotTable is broken down by sales territory group on rows.

6. In the PivotTable, expand all the sales territory groups and note that only the United States can be
expanded to individual territories.

7. Close Excel without saving the workbook. Then close Visual Studio, saving your work if prompted.

Results: At the end of this exercise, you will have tested the dimensions and hierarchies you created in the
lab.

Module 4: Working with Measures and Measure Groups


Lab: Configuring Measures and Measure
Groups
Exercise 1: Configuring Measures
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab04\Starter folder, right-click Setup.cmd and then click Run as administrator.

3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

Task 2: Create a Measure Group


1. Start Visual Studio and open the Adventure Works OLAP.sln solution in the D:\Labfiles\Lab04\Starter
folder.

2. In Solution Explorer, right-click Adventure Works DSV.dsv, and then click View Designer.
3. On the Data Source View menu, click Add/Remove Tables. Then, in the Add/Remove Tables
dialog box, in the Available objects list, select FactResellerSales (dbo) and click the > button to
add the selected table to the Included objects list.
4. Click Add Related Tables to add the tables that are related to FactResellerSales, and then click OK.
Note that the tables have been added to the data source view, and then click Save All.

5. Close the Adventure Works DSV.dsv designer.


6. In Solution Explorer, right-click Sales.cube, and then click View Designer.

7. On the Cube menu, click New Measure Group, and then in the New Measure Group dialog box,
select FactResellerSales and click OK.

8. In the Measures pane, right-click the Fact Reseller Sales measure group and click Rename. Then
change the name of the measure group to Reseller Sales.

9. Expand the Reseller Sales measure group and review the names of the measures it contains. Note
that when the Reseller Sales measure group was set up, measures were created for all of the
numerical fields in the FactResellerSales table.

Task 3: Modify Measure Groups


1. On the Cube menu, point to Show Measures In and click Grid to view all the measures in the cube
as a grid. Note the aggregations that are used to summarize the measures when they are analyzed
across dimensions.

2. Click the row for the Revision Number measure, and then hold the Ctrl key and click the rows for the
following measures to select them all:

o Unit Price

o Extended Amount

o Unit Price Discount Pct

o Discount Amount

o Product Standard Cost

o Tax Amt

o Freight

3. Click the Delete icon, and then in the Delete Objects dialog box, click OK to remove these measures
(which are not required by the business analysts who use the cube).
4. Right-click the Order Quantity measure and click Rename. Then change the name of the measure to
Reseller Order Quantity.

5. Repeat the previous step to rename the following measures:


o Total Product Cost (rename to Reseller Cost).

o Sales Amount (rename to Reseller Revenue).

o Fact Reseller Sales Count (rename to Reseller Sales Count).


6. On the Cube menu, point to Show Measures In and click Tree to view measures in the cube as a
tree. Then on the File menu, click Save All.

Results: After this exercise, you should have created a new measure group for the FactResellerSales
table, removed measures that are not required, and renamed the remaining measures.

Exercise 2: Defining a Regular Relationship


Task 1: View Existing Dimensions for Measure Groups
1. On the Build menu, click Deploy Solution. If prompted for an account password, enter Pa$$w0rd
and click OK.

2. Wait for the Deploy Succeeded message in the status bar.


3. In Solution Explorer, right-click Sales.cube and click Browse.

4. In the Metadata pane, in the Measure Group drop-down list, select Internet Sales and notice the
Customer dimension.
5. In the Measure Group drop-down list, select Reseller Sales and notice that there is no Customer
dimension, because reseller sales are sold to resellers, which are defined in a different dimension
table.

Task 2: Create a Dimension


1. In Solution Explorer, right-click Dimensions and click New Dimension.

2. In the Dimension Wizard, on the welcome page, click Next.

3. On the Select Creation Method page, ensure that Use an existing table is selected, and click Next.
4. On the Specify Source Information page, in the Main table list, select DimReseller, and click Next.

5. On the Select Related Tables page, clear Geography and Sales Territory and click Next.

6. On the Select Dimension Attributes page, select only the following attributes.

o Reseller Key

o Business Type

o Reseller Name

Then click Next.

7. On the Completing the Wizard page, in the Name box, type Reseller. Then click Finish.

8. In the Sales.cube designer, click the Dimensions Usage tab.

9. On the Cube menu, click Add Cube Dimension. Then in the Add Cube Dimension dialog box,
select Reseller and click OK.
10. In the Dimensions Usage tab, click the Reseller Key cell at the intersection of the Reseller Sales
measure group and the Reseller dimension, and click the ellipsis (…) button.

11. In the Define Relationship dialog box, note that a Regular relationship type has been detected.
Then click Cancel.

12. On the File menu, click Save All.

13. On the Build menu, click Deploy Solution. If prompted for an account password, enter Pa$$w0rd
and click OK.

14. Wait for the Deploy Succeeded message in the status bar.

15. In Solution Explorer, right-click Sales.cube and click Browse. Then on the Cube menu, click
Reconnect.

16. In the Metadata pane, in the Measure Group drop-down list, select Reseller Sales and notice the
Reseller dimension.

17. Expand Measures, expand Reseller Sales, and drag Reseller Revenue to the query results area. Then
expand Reseller and drag Business Type to the left of the Reseller Revenue column to see reseller
sales revenue broken down by business type.

Results: After this exercise, you should have added a Reseller dimension that uses a regular relationship
with the Reseller Sales measure group to enable you to analyze reseller sales data.

Exercise 3: Configuring Measure Group Storage


Task 1: Configure Proactive Caching
1. In cube designer for the Sales cube, on the Cube Structure tab, in the Measures pane, right-click
Internet Sales, and then click Properties.

2. In the Properties window, click ProactiveCaching and click the ellipsis (…) button.

3. Under Standard setting, review each of the storage setting options. When finished, drag the slider to
Automatic MOLAP and then click OK.

4. Repeat the previous steps to configure the Reseller Sales for Automatic MOLAP proactive caching.

Task 2: Design Aggregations


1. In the Sales cube designer, click the Aggregations tab.

2. Right-click Internet Sales (0 Aggregation Designs), and then click Design Aggregations.
3. On the Welcome to the Aggregation Design Wizard page, click Next.

4. On the Review Aggregation Usage page, click Set All to Default, and then click Next.

5. On the Specify Object Counts page, click Count. Then, when the count process has completed, click
Next.

6. On the Set Aggregations Options page, select Performance gain reaches and ensure that the
value is set to 35%. Then click Start.

7. When the wizard has designed aggregations for a performance gain of at least 35%, click Next.

8. On the Completing the Wizard page, change the name of the aggregation to InternetSalesAgg,
select Save the aggregations but do not process them, and then click Finish.

9. Repeat the previous steps to design Reseller Sales aggregations for a 35% performance gain. Name
the aggregation design ResellerSalesAgg.

10. On the File menu, click Save All.


11. On the Build menu, click Deploy Solution. If prompted for an account password, enter Pa$$w0rd
and click OK.

12. Wait for the Deploy Succeeded message in the status bar. Then close Visual Studio.
13. Start SQL Server Management Studio. When prompted, in the Connect to Server dialog box, specify
the following settings and click Connect:

o Server type: Analysis Services

o Server name: localhost

14. In Object Explorer, expand Databases, expand Adventure Works OLAP, expand Cubes, expand
Sales, expand Measure Groups, expand Internet Sales, and expand Aggregation Designs.

15. Verify that the InternetSalesAgg aggregation design has been deployed with the cube.

16. Expand Reseller Sales and its Aggregation Designs folder to verify that the ResellerSalesAgg
aggregation design has been deployed with the cube.
17. Close SQL Server Management Studio.

Results: After this exercise, you should have defined the storage mode aggregations for the Internet
Sales and Reseller Sales measure groups.

Module 5: Introduction to MDX


Lab: Using MDX
Exercise 1: Creating Calculated Members
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab05\Starter folder, right-click Setup.cmd, and then click Run as administrator.

3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

Task 2: Use Form View to Create a Calculated Member


1. Start Visual Studio and open Adventure Works OLAP.sln in the D:\Labfiles\Lab05\Starter folder.

2. On the Build menu, click Deploy Solution. If you are prompted for impersonation credentials, enter
the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.
3. When deployment is complete, in Solution Explorer, double-click Sales.cube to open it in the cube
designer, and then click the Calculations tab.

4. On the Cube Menu, click New Calculated Member.


5. In the Name box, change the name of the calculated measure to [Internet Profit].

6. In the Calculation Tools pane, on the Metadata tab, expand Measures, and then expand Internet
Sales to view the metadata for the Internet Sales measure group.

7. Drag Internet Revenue from the Metadata tab into the Expression box.

8. In the Expression box, after [Measures].[Internet Revenue], type a minus sign (-).

9. On the Metadata tab, drag Internet Cost into the Expression box after the minus sign (-) so that the
Expression box now contains the following MDX expression:

[Measures].[Internet Revenue]- [Measures].[Internet Cost]

10. In the Format string list, select "Currency".

11. In the Non-empty behavior list, select the check boxes for Internet Cost and Internet Revenue,
and then click OK.

12. In the Associated measure group list, select Internet Sales.

13. On the File menu, click Save All.
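The Form View settings above are stored by the designer as MDX script. For reference, the generated script (visible on the Calculations tab when you switch to Script view, as in the next task) should look similar to the following sketch, which uses the measure and measure-group names from this lab:

CREATE MEMBER CURRENTCUBE.[Measures].[Internet Profit]
AS [Measures].[Internet Revenue]-[Measures].[Internet Cost],
FORMAT_STRING = "Currency",
NON_EMPTY_BEHAVIOR = { [Internet Cost], [Internet Revenue] },
VISIBLE = 1,
ASSOCIATED_MEASURE_GROUP = 'Internet Sales' ;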

Task 3: Use Script View to Create a Calculated Member


1. With the Calculations tab of the Sales.cube designer open, on the Cube menu, point to Show
Calculations in and click Script.

2. Under the existing MDX code, add the following code:

CREATE MEMBER CURRENTCUBE.[Measures].[Reseller Profit]


AS [Measures].[Reseller Revenue]-[Measures].[Reseller Cost],
FORMAT_STRING = "Currency",
NON_EMPTY_BEHAVIOR = { [Reseller Cost], [Reseller Revenue] },
VISIBLE = 1,
ASSOCIATED_MEASURE_GROUP = 'Reseller Sales' ;

3. On the File menu, click Save All.

4. On the Build menu, click Deploy Solution. If you are prompted for impersonation credentials, enter
the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct.

5. Close Visual Studio.

Results: After this exercise, you should have created two calculated members.

Exercise 2: Querying a Cube by Using MDX


Task 1: Write simple MDX queries
1. Start SQL Server Management Studio and connect to the MIA-SQL instance of Analysis Services.

2. In Object Explorer, in the Databases folder, right-click Adventure Works OLAP, point to New
Query, and then click MDX.

3. In the query window, type the following MDX query to return the Internet profit for each year:

SELECT [Measures].[Internet Profit] ON 0,


NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON 1
FROM [Sales];

4. Click Execute and review the query results.

5. Modify the query to match the following code, which includes reseller profit:

SELECT { [Measures].[Internet Profit], [Measures].[Reseller Profit] } ON 0,


NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON 1
FROM [Sales];

6. Click Execute and review the query results.

Task 2: Write an MDX Query to Return Data on Rows and Columns


1. Modify the query you created in the previous task to match the following code, which returns the
reseller profit with calendar years on columns and sales territory groups on rows:

SELECT NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,


NONEMPTY([Sales Territory].[Sales Territory].[Sales Territory Group].MEMBERS) ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Profit];

2. Click Execute and review the query results.

3. Modify the query to return reseller profit by calendar year and product category, as shown in the
following code:

SELECT NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,


NONEMPTY([Product].[Categorized Products].[Category].MEMBERS) ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Profit];

4. Click Execute and review the query results.

5. Close SQL Server Management Studio without saving any items.
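As an optional extension to the queries in this exercise (not part of the lab steps), the sales territory and product category sets from the two previous queries could be combined on a single axis by using the MDX CROSSJOIN function, as in the following sketch:

SELECT NONEMPTY([Order Date].[Calendar Date].[Calendar Year].MEMBERS) ON COLUMNS,
NONEMPTY(CROSSJOIN([Sales Territory].[Sales Territory].[Sales Territory Group].MEMBERS,
[Product].[Categorized Products].[Category].MEMBERS)) ON ROWS
FROM [Sales]
WHERE [Measures].[Reseller Profit];

This returns reseller profit for each combination of sales territory group and product category, with one row per non-empty tuple.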

Results: After this exercise, you should have written MDX queries to return data from the Sales cube.

Module 6: Enhancing a Cube


Lab: Customizing a Cube
Exercise 1: Implementing an Action
Task 1: Prepare the Lab Environment
1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab06\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Create a Drill-through Action


1. Start Visual Studio and open the Adventure Works OLAP.sln solution in the
D:\Labfiles\Lab06\Starter folder.

2. On the Build menu, click Deploy Solution. If prompted for impersonation credentials, enter the
password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct and click OK.

3. In Solution Explorer double-click Sales.cube to open it in the cube designer.

4. In the cube designer, click the Actions tab.


5. On the Cube menu, click New Drillthrough Action.

6. In the Name box, change the name of this action to Internet Sales Details.

7. In the Measure group members list, select Internet Sales.


8. In the Drillthrough Columns box, in the Select Dimensions list, click Customer, and then in the
Return Columns list, select City and Full Name, and click OK.

9. In the Drillthrough Columns box, in the Select Dimensions list, click Order Date, and then in the
Return Columns list, select Full Date Alternate Key, and click OK.

10. In the Drillthrough Columns box, in the Select Dimensions list, click Product, and then in the
Return Columns list, select English Product Name, and click OK.

11. Expand Additional Properties, and in the Caption box, type Drillthrough to Order Details.

12. On the File menu, click Save All.

Task 3: Browse a Drill-through Action


1. In Visual Studio, on the Build menu, click Deploy Solution. If prompted for impersonation
credentials, enter the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct and click OK.

2. In the cube designer, click the Browser tab.

3. On the Cube menu, click Analyze in Excel. If a security notice is displayed, click Enable.

4. In Excel, in the PivotTable Fields pane, under Internet Sales, select Internet Revenue.

5. In the PivotTable Fields pane, under Order Date, select Order Date.Calendar Date.

6. In the PivotTable Fields pane, under Product, select Categorized Products.

7. In the PivotTable, right-click the sales amount for Bikes in 2007, point to Additional Actions, and
click Drillthrough to Order Details.

8. View the new worksheet generated by the drill-through action, noting that it shows the city, customer
name, order date, and product name for each sale of a bike in 2007.

9. Close Excel without saving the workbook.

10. Keep Visual Studio open for the next exercise.

Results: After this exercise, you should have defined a drill-through action.

Exercise 2: Implementing Perspectives


Task 1: Create Perspectives
1. In Visual Studio, in the Sales.cube designer, click the Perspectives tab.

2. On the Cube menu, click New Perspective.

3. Change the name of the new perspective to Internet Sales.

4. Clear the check box for the following objects:


o The Reseller Sales measure group (this clears all of the measures in this measure group).

o The Reseller dimension.

o The Reseller Profit calculated member.

5. On the Cube menu, click New Perspective.

6. Change the name of the new perspective to Reseller Sales.

7. Clear the check box for the following objects:


o The Internet Sales measure group (this clears all the measures in this measure group).

o The Customer dimension.

o The Internet Sales Details action.


o The Internet Profit calculated member.

8. On the File menu, click Save All.

Task 2: Browse Perspectives


1. In Visual Studio, on the Build menu, click Deploy Solution. If prompted for impersonation
credentials, enter the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct and click OK.

2. When deployment has successfully completed, in the cube designer, click the Browser tab, and click
the Reconnect icon.

3. In the cube selection area above the Metadata pane, click the ellipsis (…), select Internet Sales, and
click OK.

4. In the Metadata pane, expand Measures, and note that only the Internet Sales measure group is
shown in this perspective.

5. Expand Internet Sales and drag the Internet Revenue measure to the query results area.

6. In the Metadata pane, expand the Customer dimension and drag the Yearly Income hierarchy to
the left of the Internet Revenue figure. This displays Internet revenue for customers broken down by
income band.

7. On the Cube menu, click Analyze in Excel.



8. In the Analyze in Excel dialog box, select the Reseller Sales perspective and click OK. If a security
notice is displayed, click Enable.

9. In Excel, in the PivotTable Fields pane, note that only the Reseller Sales measures are shown.

10. In the PivotTable Fields pane, under Reseller Sales, select Reseller Revenue.

11. In the PivotTable Fields pane, under Reseller, select Business Type. The PivotTable shows revenue
for reseller sales broken down by business type.

12. Close Excel without saving the workbook.

Results: After this exercise, you should have defined a perspective and browsed the cube using the new
perspective.

Exercise 3: Implementing a Translation


Task 1: Create Dimension Translations
1. In Visual Studio, in Solution Explorer, in the Dimensions folder, double-click Date.dim, to open it in
the dimension designer, and then click the Translations tab.

2. On the Dimension menu, click New Translation. Then, in the Select Language dialog box, click
French (France), and click OK.

3. In the row for the Calendar Date hierarchy, type Date du Calendrier in the French (France)
translation column.
4. In the row for the Calendar Year level, type Année in the French (France) translation column.

Tip: To type é, hold the Alt key and type 130 using the number pad on your keyboard, ensuring Num
Lock is turned on. If this is not possible, type the captions without accents.

5. In the row for the Calendar Semester level, type Semestre in the French (France) translation
column.

6. In the row for the Calendar Quarter level, type Trimestre in the French (France) translation column.

7. In the row for the Month level, type Mois in the French (France) translation column.

8. In the row for the Day level, type Journée in the French (France) translation column.

9. On the Translation tab toolbar, click Show All Attributes. This reveals the attributes that are hidden
in the dimension.

10. Click the ellipsis (…) button for the French (France) translation for the Month attribute, and in the
Attribute Data Translation dialog box, in the Translation columns list, click FrenchMonthName,
and click OK.

11. On the File menu, click Save All.

12. Close the Date.dim dimension designer.

Task 2: Create Cube Translations


1. In Solution Explorer double-click Sales.cube to view it in the cube designer.

2. In the cube designer, click the Translations tab.

3. On the Cube menu, click New Translation. Then, in the Select Language dialog box, click French
(France), and click OK.

4. In the row for the Internet Sales measure group, in the French (France) translation column, type
Ventes d'Internet.

5. In the row for the Internet Revenue measure, in the French (France) translation column, type
Revenu d'Internet.

6. In the row for the Order Date dimension, in the French (France) translation column, type Date de
Vente.

7. On the File menu, click Save All.

Task 3: Browse Translations


1. In Visual Studio, on the Build menu, click Deploy Solution. If prompted for impersonation
credentials, enter the password Pa$$w0rd for ADVENTUREWORKS\ServiceAcct and click OK.

2. When deployment has successfully completed, in the cube designer, click the Browser tab, and click
the Reconnect icon.

3. On the toolbar, in the Language list, click French (France).

4. In the Metadata pane, expand Measures, expand Ventes d'Internet, and drag Revenu d'Internet
to the query results area.
5. In the Metadata pane, expand the Date de Vente dimension, and drag the Date de Vente.Date du
Calendrier hierarchy to the left of the Revenu d'Internet figure. Note that the captions are
displayed in French, and that the month names in the Mois column are also in French.

Results: After this exercise, you should have specified translations for the time dimension metadata and
the Adventure Works cube metadata, and browsed the cube using the new translations.

Module 7: Implementing an Analysis Services Tabular Data Model


Lab: Implementing an Analysis Services Tabular Data Model
Exercise 1: Creating an Analysis Services Tabular Data Model Project
Task 1: Prepare the Lab Environment
1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab07\Starter folder, right-click Setup.cmd and click Run as administrator.

3. When prompted, click Yes to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Create a Tabular Analysis Services Project


1. On the taskbar, click Visual Studio.

2. On the File menu, point to New, and then click Project.


3. In the New Project dialog box, click Analysis Services Tabular Project. In the Name text box, type
AWSalesTab, in the Location box browse to D:\Labfiles\Lab07\Starter, and then click OK.

4. If the Tabular model designer dialog box is displayed, in the Workspace server list, select
localhost\SQL2, and in the Compatibility level box, select SQL Server 2014 / SQL Server 2012 SP1
(1103), and then click OK.

Task 3: Import Tables Into the Data Model


1. In Solution Explorer, double-click Model.bim to open the model.
2. On the Model menu, click Import from Data Source.

3. In the Table Import Wizard, on the Connect to a Data Source page, select Microsoft SQL Server,
and then click Next.

4. On the Connect to a Microsoft SQL Server Database page, in the Friendly connection name box,
type AdventureWorksDW, in the Server name box, type MIA-SQL, ensure that Use Windows
Authentication is selected, and in the Database name list, select AdventureWorksDW, and then
click Next.

5. On the Impersonation Information page, in the User Name box, type


ADVENTUREWORKS\ServiceAcct, in the Password box, type Pa$$w0rd, and click Next.

6. On the Choose How to Import the Data page, ensure that Select from a list of tables and views
to choose the data to import is selected, and then click Next.

7. On the Select Tables and Views page, select the following source tables, changing the Friendly
Name as indicated in parentheses:

o DimDate (Date)

o DimEmployee (Employee)

o DimGeography (Geography)

o DimProduct (Product)

o DimProductCategory (Product Category)

o DimProductSubcategory (Product Subcategory)

o DimReseller (Reseller)

o FactResellerSales (Reseller Sales)

8. Select the row for the DimDate table, and click Preview & Filter. Then clear the following columns
and click OK:

o SpanishDayNameOfWeek

o FrenchDayNameOfWeek

o DayNumberOfYear

o WeekNumberOfYear

o SpanishMonthName
o FrenchMonthName

o CalendarSemester

o FiscalSemester

9. Select the row for the DimEmployee table, and click Preview & Filter. Then clear the following
columns and click OK:

o SalesTerritoryKey

o NameStyle

o Title

o HireDate

o BirthDate

o LoginID

o EmailAddress

o Phone

o MaritalStatus

o EmergencyContactName

o EmergencyContactPhone

o SalariedFlag

o Gender

o PayFrequency

o BaseRate

o VacationHours

o SickLeaveHours

o CurrentFlag

o SalesPersonFlag
o StartDate

o EndDate

o Status

10. Select the row for the DimGeography table, and click Preview & Filter. Then clear the following
columns and click OK:

o SpanishCountryRegionName
o FrenchCountryRegionName

o IpAddressLocator

11. Select the row for the DimProduct table, and click Preview & Filter. Then clear the following
columns and click OK:

o WeightUnitMeasureCode

o SizeUnitMeasureCode
o SpanishProductName

o FrenchProductName

o FinishedGoodsFlag

o SafetyStockLevel

o ReorderPoint

o DaysToManufacture
o ProductLine

o DealerPrice

o Class
o Style

o ModelName

o FrenchDescription
o ChineseDescription

o ArabicDescription

o HebrewDescription

o ThaiDescription

o GermanDescription

o JapaneseDescription

o TurkishDescription

o StartDate

o EndDate

o Status

12. Select the row for the DimProductCategory table, and click Preview & Filter. Then clear the
following columns and click OK:

o SpanishProductCategoryName

o FrenchProductCategoryName

13. Select the row for the DimProductSubcategory table, and click Preview & Filter. Then clear the
following columns and click OK:

o SpanishProductSubcategoryName

o FrenchProductSubcategoryName

14. Select the row for the DimReseller table, and click Preview & Filter. Then clear the following
columns and click OK:

o OrderFrequency

o OrderMonth

o FirstOrderYear

o LastOrderYear

o ProductLine
o AddressLine1

o AddressLine2

o AnnualSales

o BankName

o MinPaymentType

o MinPaymentAmount
o AnnualRevenue

o YearOpened

15. Select the row for the FactResellerSales table, and click Preview & Filter. Then clear the following
columns and click OK:

o DueDateKey

o PromotionKey

o CurrencyKey

o SalesTerritoryKey

o RevisionNumber

o CarrierTrackingNumber

o CustomerPONumber

o DueDate

16. When you have selected and filtered the tables, in the Table Import Wizard dialog box, click Finish
and wait for the data to be imported. When the data has been imported successfully, click Close.

Task 4: Create Measures


1. In the Model.bim pane, on the Reseller Sales tab, select the first empty cell in the grid under the
OrderQuantity column.

2. On the Column menu, point to AutoSum, and click Sum. Then in the formula bar, modify the
expression that has been generated to name the measure Quantity, as shown in the following
example:

Quantity:=Sum([OrderQuantity])

3. Widen the OrderQuantity column to see the calculated Quantity measure, and click the cell in the
measure grid containing the calculated value.

4. Press F4 to view the Properties pane, and set the Format property to Whole Number and the Show
Thousand Separator property to True.

5. Click the first empty cell under the TotalProductCost column, and on the Column menu, point to
AutoSum, and click Sum. Then modify the expression that is generated to name the measure Cost,
as shown in the following example:

Cost:=Sum([TotalProductCost])

6. Widen the TotalProductCost column to see the calculated Cost measure.

7. Select the cell containing the calculated Cost measure, and in the Properties pane, ensure that the
Format property is set to Currency.

8. Click the first empty cell under the SalesAmount column, and on the Column menu, point to
AutoSum, and click Sum. Then modify the expression that is generated to name the measure
Revenue, as shown in the following example:

Revenue:=Sum([SalesAmount])

9. Widen the SalesAmount column to see the calculated Revenue measure.


10. Select the cell containing the calculated Revenue measure, and in the Properties pane, ensure that
the Format property is set to Currency.
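Following the same AutoSum-and-edit pattern as the three measures above, a profit measure could optionally be defined to mirror the Internet Profit and Reseller Profit calculated members from the multidimensional project. The following DAX sketch (the Profit name is not part of the lab steps) subtracts the Cost measure from the Revenue measure:

Profit:=[Revenue]-[Cost]

Typing this expression into an empty cell in the measure grid creates the measure; its Format property can then be set to Currency in the Properties pane, as in the previous steps.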

Task 5: Test the Model


1. In Visual Studio, with the Model.bim pane open, on the Model menu, click Analyze in Excel.
2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, note that the measures you defined are displayed under a Reseller Sales measure
group in the PivotTable Fields pane. The Reseller Sales table is also shown in this pane, and contains
the columns in the table on which the measures are based, which some users might find confusing.
You will fix this problem in the next exercise.

4. Select the Revenue measure and note that it is displayed in the PivotTable then, in the Geography
table, select EnglishCountryRegionName. The revenue is shown for each country or region, but the
column name used to display this is not user-friendly.

5. Clear the EnglishCountryRegionName field. Then in the Date table, select CalendarYear and
EnglishMonthName. The revenue is shown for each year, and each month. However, the months are
listed in alphabetical rather than chronological order. Additionally, the Reseller Sales table contains
two date keys (OrderDateKey and ShipDateKey), but it is unclear which of these dates is used to
determine the year and month. You will fix these problems in the next exercise.

6. Close Excel without saving the workbook.

Results: After this exercise, you should have created a tabular data model project.

Exercise 2: Configuring Columns and Relationships


Task 1: Configure Relationships
1. In Visual Studio, with the Model.bim pane visible, on the Model menu, point to Model View and
click Diagram View. The tables are shown as a schema diagram, with lines between them to denote
relationships. Note that there are two relationships between the Reseller Sales table and the Date
table.

2. Double-click the solid line between the Reseller Sales and Date tables and note the columns that are
used to define the relationship for this active relationship. Then click Cancel.
3. Double-click the dotted line between the Reseller Sales and Date tables and note the columns that
are used to define the relationship for this inactive relationship. Then click Cancel.

4. If the active relationship joins the tables based on the ShipDateKey column in the Reseller Sales
table, right-click the dotted line and click Mark as Active so that the active relationship is based on
the OrderDateKey.

5. Right-click the dotted relationship line (which should now represent the relationship on the
ShipDateKey column), and click Delete. When prompted to delete the relationship from the model,
click Delete from Model.

6. Right-click the Date table title bar and click Rename. Then rename the table to Order Date.
7. On the Model menu, click Existing Connections.

8. In the Existing Connections dialog box, ensure that the AdventureWorksDW connection is selected
and click Open. If you are prompted for impersonation credentials, enter the user name
ADVENTUREWORKS\ServiceAcct and the password Pa$$w0rd, and click OK.

9. On the Choose How to Import the Data page, ensure that Select from a list of tables and views
to choose the data to import is selected, and then click Next.

10. On the Select Tables and Views page, select the DimDate table and change the Friendly Name to
Ship Date.

11. Select the row for the DimDate table, and click Preview & Filter. Then clear the following columns
and click OK:

o SpanishDayNameOfWeek

o FrenchDayNameOfWeek
o DayNumberOfYear

o WeekNumberOfYear

o SpanishMonthName

o FrenchMonthName

o CalendarSemester

o FiscalSemester

12. Click Finish and wait for the table to be imported. When it has been imported successfully, click
Close.

13. Arrange the diagram so you can see the Ship Date and Reseller Sales tables, and then drag the
ShipDateKey column from the Reseller Sales table to the DateKey column in the Ship Date table to
create the relationship.
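This lab handles the role-playing Date table by importing DimDate a second time as Ship Date. An alternative approach (not used in this lab) would be to keep the single Date table with its inactive ShipDateKey relationship and activate that relationship per calculation by using the DAX USERELATIONSHIP function inside CALCULATE — a sketch assuming a single table still named Date and a hypothetical Shipped Revenue measure name:

Shipped Revenue:=CALCULATE([Revenue], USERELATIONSHIP('Reseller Sales'[ShipDateKey], 'Date'[DateKey]))

Importing a second copy of the table, as the lab does, keeps client-tool browsing simpler; USERELATIONSHIP avoids duplicating data but requires a dedicated measure for each relationship.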

Task 2: Rename and Hide Columns


1. In Visual Studio, with the Model.bim pane visible, in the diagram, click the title bar of the
Geography table. Then click its Maximize icon.

2. In the maximized Geography table, click the GeographyKey column, hold the Ctrl key, and click the
SalesTerritoryKey column to select both columns. Then right-click either of the selected columns
and click Hide from Client Tools.

3. In the maximized Geography table, right-click the StateProvinceCode column and click Rename.
Then rename the column to State or Province Code.

4. Repeat the previous step to rename the following columns in the Geography table.

Column New Name

StateProvinceName State or Province

CountryRegionCode Country or Region Code

EnglishCountryRegionName Country or Region

PostalCode Postal Code

5. Click the Restore icon for the Geography table.

6. Maximize the Reseller table, and then hide the ResellerKey, GeographyKey, and
ResellerAlternateKey columns and rename the following columns:

Column New Name

BusinessType Business Type

ResellerName Reseller Name

NumberEmployees Employees

7. Restore the Reseller table and maximize the Employee table. Then hide the EmployeeKey,
ParentEmployeeKey, EmployeeNationalIDAlternateKey, and
ParentEmployeeNationalIDAlternateKey columns and rename the following columns:

Column New Name

FirstName First Name

LastName Last Name

MiddleName Middle Name

DepartmentName Department

EmployeePhoto Photo

8. Restore the Employee table and maximize the Order Date table. Then hide the DateKey,
DayNumberOfWeek, and MonthNumberOfYear columns and rename the following columns:

Column New Name

FullDateAlternateKey Date

EnglishDayNameOfWeek Weekday

DayNumberOfMonth Day of Month

EnglishMonthName Month

CalendarQuarter Calendar Quarter

CalendarYear Calendar Year

FiscalQuarter Fiscal Quarter

FiscalYear Fiscal Year

9. Restore the Order Date table and maximize the Ship Date table. Then hide the DateKey,
DayNumberOfWeek, and MonthNumberOfYear columns and rename the following columns.

Column New Name

FullDateAlternateKey Date

EnglishDayNameOfWeek Weekday

DayNumberOfMonth Day of Month

EnglishMonthName Month

CalendarQuarter Calendar Quarter

CalendarYear Calendar Year

FiscalQuarter Fiscal Quarter

FiscalYear Fiscal Year



10. Restore the Ship Date table and maximize the Product table. Then hide the ProductKey,
ProductAlternateKey, and ProductSubcategoryKey columns and rename the following columns:

Column New Name

EnglishProductName Product Name

StandardCost Standard Cost

ListPrice List Price

SizeRange Size Range

LargePhoto Photo

EnglishDescription Description

11. Restore the Product table and maximize the Product Subcategory table. Then hide the
ProductSubcategoryKey, ProductSubcategoryAlternateKey, and ProductCategoryKey columns,
and rename the EnglishProductSubcategoryName column to Subcategory.

12. Restore the Product Subcategory table and maximize the Product Category table. Then hide the
ProductCategoryKey and ProductCategoryAlternateKey columns and rename the
EnglishProductCategoryName column to Category.

13. Restore the Product Category table and maximize the Reseller Sales table. Then hide all columns
other than the Quantity, Cost, and Revenue measures you created in the previous exercise, and
restore the Reseller Sales table.

Task 3: Configure Column Sort Order


1. In Visual Studio, with the Model.bim pane open in diagram view, on the Model menu, point to
Model View and click Data View.

2. On the Ship Date tab, click the column heading for the Weekday column. Then in the Column
menu, point to Sort and click Sort by Column.

3. In the Sort By Column dialog box, in the Sort column, ensure that Weekday is selected, and, in the
By column, select DayNumberOfWeek. Then click OK.

4. On the Ship Date tab, click the column heading for the Month column. Then in the Column menu,
point to Sort and click Sort by Column.
5. In the Sort By Column dialog box, in the Sort column, ensure that Month is selected, and, in the By
column, select MonthNumberOfYear. Then click OK.

6. On the Order Date tab, click the column heading for the Weekday column. Then in the Column
menu, point to Sort and click Sort by Column.

7. In the Sort By Column dialog box, in the Sort column, ensure that Weekday is selected, and, in the
By column, select DayNumberOfWeek. Then click OK.
8. On the Order Date tab, click the column heading for the Month column. Then in the Column menu,
point to Sort and click Sort by Column.

9. In the Sort By Column dialog box, in the Sort column, ensure that Month is selected and, in the By
column, select MonthNumberOfYear. Then click OK.
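The Sort by Column setting tells client tools to order a text column by a related numeric column instead of alphabetically. The following Python sketch illustrates that behavior; the month values are invented for illustration and are not taken from the model:

```python
# Sort month names by an associated "sort by" column instead of alphabetically,
# mirroring the tabular model's Sort by Column behavior.
months = [
    ("April", 4), ("August", 8), ("January", 1),
    ("February", 2), ("July", 7),
]

alphabetical = sorted(name for name, _ in months)
by_month_number = [name for name, num in sorted(months, key=lambda m: m[1])]

print(alphabetical)      # alphabetical order: April first
print(by_month_number)   # calendar order: January first
```

Without the sort-by column, a PivotTable would show April before January, which is exactly what step 6 of the Test the Model task checks has been corrected.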

Task 4: Create Hierarchies


1. In Visual Studio, with the Model.bim pane visible, on the Model menu, point to Model View and
click Diagram View.

2. In the diagram, click the title bar of the Geography table. Then click its Maximize icon.

3. In the title bar of the maximized Geography table, click the Create Hierarchy icon, and then rename
the new hierarchy to Location.

4. In the maximized Geography table, drag the Country or Region column onto the Location
hierarchy. Then drag the State or Province column onto the Location hierarchy, drag the City
column onto the Location hierarchy, and drag the Postal Code column onto the Location hierarchy.

5. Restore the Geography table, and maximize the Order Date table.

6. In the maximized Order Date table, create a hierarchy named Calendar Date that contains the
following fields:
o Calendar Year

o Calendar Quarter

o Month

o Day of Month

7. In the maximized Order Date table, create a second hierarchy named Fiscal Date that contains the
following fields:

o Fiscal Year

o Fiscal Quarter

o Month

o Day of Month

8. Restore the Order Date table.

Task 5: Test the Model


1. In Visual Studio, with the Model.bim pane open, on the Model menu, click Analyze in Excel.
2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. In Excel, in the PivotTable Fields pane, select Revenue. The total revenue is shown in the PivotTable.

4. In the Order Date table, select the Calendar Date hierarchy. The revenue for each calendar year is
shown.

5. In the PivotTable, expand 2007 to reveal the quarterly revenue, and then expand quarter 3 to show
the monthly revenue. Note that the months are displayed in chronological order.

6. Close Excel without saving the workbook.

Results: After completing this exercise, you should have a tabular data model that includes renamed
columns, custom sort orders, and hierarchies.

Exercise 3: Deploying an Analysis Services Tabular Data Model


Task 1: Deploy the Reseller Sales Project
1. In Visual Studio, in Solution Explorer, right-click the AWSalesTab project and click Properties.

2. In the AWSalesTab Property Pages dialog box, in the Deployment Server section, verify that the
Server value is localhost\SQL2.

3. Change the Database property to AdventureWorksTab and change the Cube Name property to
Reseller Sales. Then click OK.

4. On the Build menu, click Deploy AWSalesTab.


5. In the Deploy dialog box, when deployment has completed, click Close.

6. Close Visual Studio.

Task 2: Use the Deployed Tabular Database


1. Start Excel and create a new blank workbook.
2. On the Data tab, in the Get External Data area (or drop-down list, depending on your screen
resolution), in the From Other Sources drop-down list, select From Analysis Services.

3. In the Data Connection Wizard dialog box, in the Server name box, type MIA-SQL\SQL2, ensure
that Use Windows Authentication is selected, and click Next.

4. On the Select Database and Table page, ensure that the AdventureWorksTab database and the
Reseller Sales cube are selected. Then click Next.
5. On the Save Data Connection File and Finish page, click Finish.

6. In the Import Data dialog box, ensure that the Existing Worksheet option is selected and click OK.

7. In the PivotTable Fields pane, under Reseller Sales, select Revenue.

8. In the Geography table, select the Location hierarchy.

9. Drag the Calendar Date hierarchy to the Columns area of the PivotTable Fields pane.

10. Explore the data in the PivotTable. When you have finished, close Excel without saving the workbook.

Results: After this exercise, you should have deployed the tabular data model project.

Module 8: Introduction to DAX


Lab: Using DAX to Enhance a Tabular Data
Model
Exercise 1: Creating Calculated Columns
Task 1: Prepare the Lab Environment
1. Ensure the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then log
on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab08\Starter folder, right-click Setup.cmd and click Run as administrator.

3. When prompted, click Yes to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Concatenate Text Values


1. Start Visual Studio and open the AWSalesTab.sln solution in the D:\Labfiles\Lab08\Starter folder. If
the Tabular model designer dialog box is displayed, in the Workspace server list, specify
localhost\SQL2 and then click OK.
2. In Solution Explorer, double-click Model.bim to open the data model designer. If you are prompted
to run a script on the server, click Yes.

3. On the Model menu, point at Process, and click Process All. If you are prompted for impersonation
credentials, specify the user name ADVENTUREWORKS\ServiceAcct with the password Pa$$w0rd
and click OK. Then when all of the tables have been processed, click Close.

4. On the Employee tab, after the existing columns, double-click the Add Column header and enter
the column name Employee Name.

5. With the Employee Name column selected, in the formula bar, enter the following DAX expression:

=[First Name] & IF(ISBLANK([Middle Name]), "", CONCATENATE(" ", [Middle Name])) & CONCATENATE(" ",
[Last Name])

6. View the results and verify that they show the full employee name (including middle name if present).

7. Click the First Name column header, and then hold Shift and click the Last Name column header
to select the First Name, Middle Name, and Last Name columns. Then right-click any of the selected
columns and click Hide from Client Tools.

8. On the File menu, click Save All.
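The expression only inserts the middle name (and its leading space) when one is present. The same logic, sketched in Python with invented sample names that are not taken from the model data:

```python
def employee_name(first, middle, last):
    # Mirror the DAX expression: skip the middle name (and its space) when blank.
    middle_part = "" if not middle else " " + middle
    return first + middle_part + " " + last

print(employee_name("Kim", "B", "Abercrombie"))   # Kim B Abercrombie
print(employee_name("Kim", None, "Abercrombie"))  # Kim Abercrombie
```

Without the ISBLANK check, employees with no middle name would get a double space in the concatenated result.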

Task 3: Calculate a Numeric Value


1. On the Reseller Sales tab, in the first empty column after the existing ones, double-click Add
Column and name the new column SalesProfit.

2. With the SalesProfit column selected, in the formula bar, enter the following DAX expression:

=[SalesAmount]-[TotalProductCost]

3. Wait for the table to finish updating, and note the calculated values in the new column.

4. Press F4 and view the Properties pane to confirm that the Data Format property for this column has
been set to Currency.

Task 4: Retrieve Related Values


1. On the Product tab, after the existing columns, double-click the Add Column header and enter the
column name Subcategory.

2. With the Subcategory column selected, in the formula bar, enter the following DAX expression:

=RELATED('Product Subcategory'[Subcategory])

3. Scroll down to verify that products with a subcategory are shown with the subcategory name. Note
that some products at the beginning of the table are uncategorized.

4. After the existing columns, double-click the Add Column header and enter the column name
Category.

5. With the Category column selected, in the formula bar, enter the following DAX expression:

=RELATED('Product Category'[Category])

6. Scroll down to verify that products with a category are shown with the category name. Note that
some products at the beginning of the table are uncategorized.

7. On the Model menu, point to Model View and click Diagram View.
8. In the diagram, click the title bar of the Product table. Then click its Maximize icon.

9. In the maximized Product table, click the Create Hierarchy icon and create a hierarchy named
Categorized Products.
10. Drag the following columns to the Categorized Products hierarchy:

o Category

o Subcategory
o Product Name

11. Restore the Product table.

12. On the Model menu, point to Model View and click Data View.

13. On the Product Subcategory tab, right-click the Subcategory column, which should be the only
visible column, and click Hide from Client Tools.

14. On the Product Category tab, right-click the Category column, which should be the only visible
column, and click Hide from Client Tools.

15. On the File menu, click Save All.

Task 5: View Calculated Columns in Excel


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, in the Reseller Sales table, select the Revenue measure.

4. In the Product table, select Categorized Products so that the PivotTable shows sales revenue for
each product category.

5. In the Employee table, drag the Employee Name column to the Columns area so that the
PivotTable shows sales revenue for each employee.

6. Expand product categories and subcategories to verify that the hierarchy shows subcategory and
product names.

7. Close Excel without saving the workbook.

Results: After this exercise, you should have a calculated column named Employee Name in the
Employee table, a calculated column named SalesProfit in the Reseller Sales table, and a hierarchy
named Categorized Products in the Product table.

Exercise 2: Creating Measures


Task 1: Create a Measure that Aggregates a Column
1. In Visual Studio, in the data model designer, on the Reseller Sales tab, click the first empty cell in the
measure grid under the SalesProfit column.

2. In the formula bar, enter the following DAX expression:

Profit:=SUM([SalesProfit])

3. Select the cell containing the Profit measure, and in the properties pane, set the Format property to
Currency.

4. Right-click the column header for the SalesProfit column and click Hide from Client Tools.
5. On the File menu, click Save All.

Task 2: Create a Measure that References Other Measures


1. On the Reseller Sales tab, click the empty cell in the measure grid under the Profit measure.

2. In the formula bar, enter the following DAX expression:

Margin:=[Profit]/[Revenue]

3. Select the cell containing the Margin measure, and in the properties pane, set Format to Currency.

4. On the File menu, click Save All.
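A measure that references other measures is re-evaluated in each cell's filter context, so Margin is always the ratio of that cell's total profit to that cell's total revenue, not an average of row-level ratios. A toy Python sketch of the distinction, using invented figures:

```python
# Two invented sales rows: (revenue, profit).
rows = [(100.0, 50.0), (300.0, 30.0)]

total_revenue = sum(r for r, _ in rows)
total_profit = sum(p for _, p in rows)

# Ratio of totals (what Margin:=[Profit]/[Revenue] computes)...
margin = total_profit / total_revenue
# ...is not the same as averaging the per-row ratios.
avg_row_margin = sum(p / r for r, p in rows) / len(rows)

print(margin)          # 0.2
print(avg_row_margin)  # 0.3
```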

Task 3: Create a Measure that Uses Time Intelligence


1. In the data model designer, click the Order Date tab.

2. On the Table menu, point at Date and click Mark as Date Table.

3. In the Mark as Date Table dialog box, select the Date column, and click OK.

4. On the Reseller Sales tab, click the empty cell in the measure grid under the Revenue measure.

5. In the formula bar, enter the following DAX expression:

Previous Year Revenue:=CALCULATE(SUM([SalesAmount]), SAMEPERIODLASTYEAR('Order Date'[Date]))

6. Select the cell containing the Previous Year Revenue measure, and in the properties pane, set the
Format property to Currency.

7. On the File menu, click Save All.
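SAMEPERIODLASTYEAR shifts whatever date range is in the current filter context back by exactly one year, so the measure returns revenue for the equivalent period of the prior year. A minimal Python sketch of the idea, using invented (year, quarter) revenue figures rather than model data:

```python
# Toy illustration of a "same period last year" calculation.
# sales maps (year, quarter) -> revenue; the figures are invented.
sales = {
    (2006, 1): 100.0, (2006, 2): 120.0,
    (2007, 1): 150.0, (2007, 2): 180.0,
}

def revenue(year, quarter):
    return sales.get((year, quarter), 0.0)

def previous_year_revenue(year, quarter):
    # Shift the filter context back one year, as SAMEPERIODLASTYEAR does.
    return revenue(year - 1, quarter)

print(previous_year_revenue(2007, 1))  # 100.0 -> the 2006 Q1 revenue
```

This is why, in the next task, the 2006 quarterly revenue appears as the previous year revenue for the same quarters of 2007.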



Task 4: View Measures in Excel


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, in the Reseller Sales table, select the Revenue, Profit, Margin, and Previous
Year Revenue measures.

4. In the Order Date table, drag the Calendar Date hierarchy to the Rows area so that the PivotTable
shows revenue, profit, margin, and previous year revenue for each year.

5. Expand 2006 and 2007, and note that the revenue for each quarter in 2006 is shown as the previous
year revenue for the same quarters in 2007.

6. Close Excel without saving the workbook.

Results: At the end of this exercise, the Reseller Sales table should contain the following measures:

Profit
Margin

Previous Year Revenue

Exercise 3: Creating a KPI


Task 1: Create a Measure to Calculate a KPI Goal
1. In Visual Studio, in the data model designer, on the Reseller Sales tab, click the empty cell in the
measure grid under the Previous Year Revenue measure.

2. In the formula bar, enter the following DAX expression:

Revenue Goal:=[Previous Year Revenue] * 1.2

3. Select the cell containing the Revenue Goal measure, and in the properties pane, set the Format
property to Currency.

4. Right-click the cell containing the Revenue Goal measure, and click Hide from Client Tools.

5. On the File menu, click Save All.

Task 2: Create a KPI


1. On the Reseller Sales tab, right-click the cell in the measure grid containing the Revenue measure
and click Create KPI.

2. In the Key Performance Indicator (KPI) dialog box, note that the KPI base measure (value) is
defined by the Revenue measure. Then, under Define target value, ensure that Measure is selected
and select the Revenue Goal measure.

3. Set the first status threshold to 75% and the second to 95%.

4. Note the default icon style, and click OK.

5. On the File menu, click Save All.
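A KPI compares the base measure to its target and maps the ratio onto status bands at the thresholds set above (75% and 95%). A small Python sketch of that mapping; the string labels are illustrative, since the designer shows icons rather than text:

```python
def kpi_status(value, goal, low=0.75, high=0.95):
    # Below 75% of goal -> bad; 75% to 95% -> warning; 95% and above -> good.
    ratio = value / goal
    if ratio < low:
        return "bad"
    if ratio < high:
        return "warning"
    return "good"

print(kpi_status(70.0, 100.0))   # bad
print(kpi_status(90.0, 100.0))   # warning
print(kpi_status(100.0, 100.0))  # good
```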



Task 3: View a KPI in Excel


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.

3. When Excel opens, in the PivotTable Fields pane, expand KPIs and expand Revenue. Then, select
Value, Goal, and Status.

4. In the Order Date table, drag the Fiscal Date hierarchy to the Rows area so that the PivotTable
shows the KPI data for each year.

5. Expand 2008 and its quarters and months to view the status of the KPI for each month.
6. Close Excel without saving the workbook.

Results: At the end of this exercise, the Reseller Sales table should include a measure named Revenue
Goal and a KPI based on the Revenue measure.

Exercise 4: Implementing a Parent-Child Hierarchy


Task 1: Create a Path Column
1. In Visual Studio, in the data model designer, click the Employee tab.
2. After the existing columns, double-click the Add Column header and enter the column name Path.

3. With the new Path column selected, in the formula bar, enter the following DAX expression:

=PATH([EmployeeKey], [ParentEmployeeKey])

4. Wait for the table to finish updating, and note the calculated values in the new column.
5. On the File menu, click Save All.

Task 2: Create a Column for Each Hierarchy Level


1. After the existing columns, double-click the Add Column header and enter the column name Level1.

2. With the new Level1 column selected, in the formula bar, enter the following DAX expression:

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 1, 1))

3. After the existing columns, double-click the Add Column header and enter the column name Level2.

4. With the new Level2 column selected, in the formula bar, enter the following DAX expression:

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 2, 1))

5. After the existing columns, double-click the Add Column header and enter the column name Level3.

6. With the new Level3 column selected, in the formula bar, enter the following DAX expression:

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 3, 1))

7. After the existing columns, double-click the Add Column header and enter the column name Level4.

8. With the new Level4 column selected, in the formula bar, enter the following DAX expression:

=LOOKUPVALUE ([Employee Name], [EmployeeKey], PATHITEM ([Path], 4, 1))

9. On the File menu, click Save All.
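PATH walks the ParentEmployeeKey self-reference up to the root and returns a pipe-delimited list of keys (for example, 112|23|189), and PATHITEM then extracts the key at a given 1-based level, which LOOKUPVALUE translates into an employee name. A Python sketch of both functions, using invented employee keys:

```python
# Toy parent-child table: employee key -> parent key (None = root).
parents = {112: None, 23: 112, 189: 23}

def path(key):
    # Build the pipe-delimited ancestor chain, root first, like DAX PATH.
    chain = []
    while key is not None:
        chain.append(str(key))
        key = parents[key]
    return "|".join(reversed(chain))

def path_item(p, position):
    # Return the key at the given 1-based level, like DAX PATHITEM, or None.
    parts = p.split("|")
    return parts[position - 1] if position <= len(parts) else None

print(path(189))                # 112|23|189
print(path_item(path(189), 2))  # 23
```

Levels beyond an employee's depth return no value, which is why the Level4 column is blank for employees near the top of the hierarchy.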

Task 3: Use the Calculated Columns in a Hierarchy


1. On the Model menu, point to Model View, and then click Diagram View. Click the Employee table,
and then click the Maximize button in its title bar.

2. In the title bar of the maximized Employee table, click the Create Hierarchy button. After the new
hierarchy is created, change its name to Employee Hierarchy.

3. Drag the Level1, Level2, Level3, and Level4 columns to the hierarchy in that order.

4. Click the Path column, hold Shift and click the Level4 column. Then right-click the selected columns
and click Hide from Client Tools.

5. Click the Restore button in the title bar of the maximized Employee table.

6. On the File menu, click Save All.

Task 4: View a Parent-Child Hierarchy in Excel


1. On the Model menu, click Analyze in Excel.

2. In the Analyze in Excel dialog box, ensure that Current Windows User is selected and that the
(Default) perspective is specified, and click OK.
3. When Excel opens, in the Reseller Sales table, select the Revenue measure.

4. In the Employee table, select Employee Hierarchy so that the PivotTable shows revenue for the
top-level employee.
5. Expand the employee hierarchy to view sales totals for managers and their subordinates. Note that
the personal revenue for sales managers is shown with a blank employee name under the total for
that sales manager.

6. Close Excel without saving the workbook.

7. Close Visual Studio.

Results: At the end of this exercise, the Employee table should include a hierarchy named Employee
Hierarchy.

Module 9: Implementing Reports with SQL Server Reporting


Services
Lab: Creating a Report with Report Designer
Exercise 1: Creating a Report
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab09\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Create a Report Project


1. Start Visual Studio, and on the File menu, point at New and click Project.

2. In the New Project dialog box, in the Business Intelligence section, select the Report Server
Project Wizard template. Then, in the Name box, type AWReports; in the Location box, browse to
the D:\Labfiles\Lab09\Starter folder; and click OK.

3. On the Welcome to the Report Wizard page of the Report Wizard, click Next.
4. On the Select the Data Source page, enter the following information and click Next:

o Name: AdventureWorksDW

o Type: Microsoft SQL Server


o Connection String: Data Source=MIA-SQL;Initial Catalog=AdventureWorksDW

o Make this a shared data source: Selected

5. On the Design the Query page, click Query Builder. Then in the Query Designer window, click
Import, select the Sales Query.sql Transact-SQL script file in the D:\Labfiles\Lab09\Starter folder, and
click Open.

6. In the Query Designer window, on the toolbar, click Run. Then view the results of the query in the
bottom pane. This is the data for the sales report.

7. In the Query Designer window, click OK, and on the Design the Query page of the Report Wizard,
click Next.
8. In the Select the Report Type page, select Tabular and click Next.

9. In the Design the Table page, click Year, and then while holding the Ctrl key, click Month,
ProductCategory, ProductSubcategory, Product, SalesOrderNumber, OrderDate, and
SalesAmount. Then click Details to add these fields to the details group of the report, and click
Next.

10. In the Choose the Table Style page, view each of the available styles, and then select Generic and
click Next.

11. If the Choose a Deployment Location page is displayed, review the default settings and click Next.

12. In the Completing the Wizard page, change the report name to Internet Sales, and click Finish.

Task 3: Modify Report Properties


1. On the View menu, click Report Data.

2. In the Report Data pane, expand the Datasets folder. Then right-click Dataset1 and click Dataset
Properties.

3. In the Dataset Properties dialog box, change the dataset name to InternetSales, and click OK.

4. In the Report Designer, in the Row Groups pane, right-click (table1_Details_Group) and click Group
Properties.

5. In the Group Properties dialog box, on the General tab, change the name to Sales_Details. Then
click OK.

Task 4: Format the Report


1. In the Report Designer, click the Preview tab and view the report. If a console window opens,
minimize it. Note that the report contains no formatting.

2. In the Report Designer, click the Design tab, and then click in any blank space under the report to
ensure no elements are selected. Note that the report contains two elements: a text box containing
the report title, and a tablix containing the report data.

3. Click the border of the text box containing the report title (Internet Sales) and use the formatting
buttons on the toolbar to increase its size to 14pt and make it bold.

4. Click the table containing the report data, so that the gray column and row headers are displayed,
and then click the gray cell where the column and row headers intersect at the top left to select the
table. When the table is selected, click and drag its selection handle (a small square containing four
directional arrows) to move the table down about one centimeter.

5. Select the title text box and resize it to ensure the title is completely visible.

6. Click the table containing the report data, so that the gray column and row headers are displayed,
and click the row header for the first row (which contains the column titles). Then use the formatting
buttons on the toolbar to make the column titles bold, change the background color to light gray,
and align the column titles to the left.

7. Click the row header for the details row and use the formatting buttons on the toolbar to align the
cells to the left.
8. Right-click the [OrderDate] detail cell and click Text Box Properties. On the Number tab, select the
Date category and the 31-Jan-00 format. Then click OK.

9. Right-click the [SalesAmount] detail cell and click Text Box Properties. On the Number tab, select
the Currency category, specify two decimal places, select Use 1000 separator, and select the
($12,345.00) format. Then click OK.

10. Drag the borders of the column headings to resize the columns so that the values in the table fit
them. Switch back and forth between the Preview and Design tabs to check your work.

Results: After this exercise, you should have a report that shows sales data from the
AdventureWorksDW database.

Exercise 2: Grouping and Aggregating Data


Task 1: Delete Columns and Rows
1. In Visual Studio, in the AWReports project, click the table in the Internet Sales report so that the
gray column and row headers are displayed, and then click the column header for the Year column
to select it.

2. Right-click the selected Year column header, and click Delete Columns.

3. Click the gray row heading for the first row (which contains the column titles) to select it.

4. Right-click the selected row header, and click Delete Rows.

Task 2: Group and Sort the Report Data


1. In the Row Groups pane, select the Sales_Details group.

2. Right-click the selected group, point to Add Group, and click Parent Group.

3. In the Tablix group dialog box, in the Group by list, select [Month]. Then select Add group header
and click OK.

4. Click the Preview tab to view the report and note that a new row and column have been added to
the report, and that the Month field is now displayed in both the Month group header and the
details group.

5. Click the Design tab, and then right-click the column containing the Month field in the details
group, and click Delete Columns.

6. Click the Preview tab to view the report and note that the month grouping displays the data in
alphabetical order by month name (so the first month is April, followed by August, and so on).

7. Click the Design tab, and in the Row Groups section of the Report Designer, click the drop-down
arrow for the Month group, and click Group Properties.

8. In the Group Properties dialog box, on the Sorting tab, change the Sort by column to [MonthNo].
Then click OK.

9. Click the Preview tab to view the report and note that the grouping now displays in calendar month
order (so the first month is January, followed by February, and so on).

10. Click the Design tab to view the report in design view, and in the Row Groups section of the Report
Designer, click the drop-down arrow for the Sales_Details group, point to Add Group, and click
Parent Group.

11. In the Tablix group dialog box, in the Group by list, select [ProductCategory]. Then select Add
group header and click OK.

12. Note that a new group header containing the ProductCategory field has been created, and that the
original ProductCategory column in the detail group still exists. Then right-click the column
containing the ProductCategory field in the details group, and click Delete Columns.

13. Repeat the previous three steps to create groups for the [ProductSubcategory] and [Product] fields.

14. Click the Preview tab and note that some group heading columns may not be wide enough to
display the field values. Then click the Design tab and adjust column widths accordingly.

Task 3: Add Aggregate Summary Values


1. In the Row Groups section of the Report Designer, click the drop-down arrow for the Sales_Details
group, point to Add Total, and click Before. Note that a total for the SalesAmount field is added to
the Product group header row.

2. In the Row Groups section of the Report Designer, click the drop-down arrow for the Product
group, point to Add Total, and click Before. Note that a text box with the heading Total and a total
for the SalesAmount field is added to the ProductSubcategory group header row.

3. Repeat the previous step for the ProductSubcategory and ProductCategory groups.

4. Click the Preview tab and note that the totals are calculated for each group.

5. Click the Design tab and click the cell in the Product group heading row immediately above the
SalesOrderNumber field. Then right-click the selected cell and click Expression.

6. In the Expression dialog box, in the Category pane, expand Common Functions and select
Aggregate. Then in the Item pane, double-click Count. Note that the expression value changes to
=Count(.

7. In the Expression dialog box, in the Category pane, select Fields (InternetSales) and in the Item
pane ensure <All> is selected. Then in the Values pane, double-click SalesOrderNumber. Note that
the expression value changes to =Count(Fields!SalesOrderNumber.Value.

8. In the text box containing the incomplete expression value, type ) to complete the expression as
=Count(Fields!SalesOrderNumber.Value). Then click OK.

9. Right-click the cell containing the Count expression, and click Copy.

10. Right-click the cell immediately above the Count expression, and click Paste to copy it to the
ProductSubcategory group heading row.

11. Repeat the previous step to copy the expression to the ProductCategory and Month group heading
rows.
12. Click the Preview tab and note that the Count expression is calculated for each group.
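The group totals and the Count expression both aggregate the detail rows within each group's scope: Sum over SalesAmount and Count over SalesOrderNumber. The following Python sketch shows the same aggregation over invented order rows, purely to illustrate the scoping:

```python
from collections import defaultdict

# Invented detail rows: (product_category, sales_order_number, sales_amount).
rows = [
    ("Bikes", "SO100", 1500.0),
    ("Bikes", "SO101", 2200.0),
    ("Clothing", "SO102", 80.0),
]

totals = defaultdict(float)
counts = defaultdict(int)
for category, order, amount in rows:
    # Sum(SalesAmount) and Count(SalesOrderNumber) within each group.
    totals[category] += amount
    counts[category] += 1

print(totals["Bikes"], counts["Bikes"])  # 3700.0 2
```

In the report, placing the expression in a group header row is what sets its scope to that group rather than to the whole dataset.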

Task 4: Enable Drill-down Interactions


1. Click the Design tab to view the report in design view.

2. In the Row Groups section of the Report Designer, click the drop-down arrow for the Sales_Details
group and click Group Properties.

3. On the Visibility tab, select Hide, select Display can be toggled by this report item, and in the
drop-down list, select Product1. Then click OK.

4. Repeat the previous two steps to hide the following groups:

o Product (toggled by ProductSubcategory1)

o ProductSubcategory (toggled by ProductCategory1)


5. Click the Preview tab and verify that you can expand and collapse the Product Category, Product
Subcategory, and Product groups.

Task 5: Add Page Breaks


1. Click the Design tab to view the report in design view.

2. Click anywhere on the designer surface outside the white body of the report and press F4. Then, in
the Properties pane, ensure the Report object is selected.

3. Set the InitialPageName property to Sales Summary.



4. In the Row Groups section of the Report Designer, click the Month group, and in the Properties
pane, expand Group and expand PageBreak. Then set the BreakLocation property to Between.

5. With the Month group still selected, click the drop-down arrow for the PageName property and
click Expression.

6. In the Expression dialog box, in the Category pane, select Fields (InternetSales). In the Item pane
ensure <All> is selected, and in the Values pane, double-click Month. Verify that the expression
matches the code sample below, and then click OK.

=Fields!Month.Value

7. Click the table containing the report data, so that the gray column and row headers are displayed,
and then click the gray cell where the column headers and row headers intersect at the top left to
select the table.

8. In the Properties pane, ensure that the drop-down list at the top contains the value table1 Tablix.
Then expand PageBreak and set the BreakLocation property to Start.

9. Click the Preview tab and note that the first page of the report contains only the title. Then use the
page navigation buttons to scroll through the pages of the report, noting that each month starts on a
new page.

Results: After this exercise, you should have a report that includes sales data grouped by month, product
category, subcategory, and product.

Exercise 3: Publishing a Report


Task 1: Deploy Report Items
1. In Solution Explorer, right-click AWReports and click Properties.
2. In the AWReports Property Pages dialog box, set the following property values, and then click OK:

o OverwriteDatasets: True

o OverwriteDataSources: True
o TargetDatasetFolder: http://mia-sql/sites/adventureworks/Reports/Datasets

o TargetDataSourceFolder: http://mia-sql/sites/adventureworks/Reports/Data Sources

o TargetReportFolder: http://mia-sql/sites/adventureworks/Reports

o TargetReportPartFolder: http://mia-sql/sites/adventureworks/Reports/Report Parts

o TargetServerURL: http://mia-sql/sites/adventureworks

o TargetServerVersion: SQL Server 2008 R2 or later


3. On the Build menu, click Deploy AWReports.

4. When deployment has succeeded, close Visual Studio, saving your changes if prompted.

Task 2: View a Published Report


1. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks.

2. In the Quick Launch area, click Reports.

3. In the Reports library, click Internet Sales and wait for the report to be rendered.

4. Click the Next Page button to view sales for January.



5. On the Actions menu, point to Export and click Excel.

6. When prompted, save the report as Internet Sales.xlsx in the D:\Labfiles\Lab09\Starter folder.

7. Wait for the file to be downloaded, and after it is complete, click Open.

8. View the report in Excel, and then close Excel.

9. Close Internet Explorer.

Results: After this exercise, you should have configured and deployed a Report Server project.

Module 10: Enhancing Reports with SQL Server Reporting Services
Lab: Enhancing a Report
Exercise 1: Adding a Chart to a Report
Task 1: Prepare the Lab Environment
1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to
20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab10\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Add a Chart Report Item


1. Start Visual Studio and open the AWReports.sln solution in the D:\Labfiles\Lab10\Starter folder.

2. In Solution Explorer, double-click Internet Sales.rdl to open it in Report Designer.


3. Click the table containing the report data, so that the gray column and row headers are displayed,
and then click the gray cell where the column and row headers intersect at the top left to select the
table. When the table is selected, click and drag its selection handle to move the table down about 10
centimeters (3 inches).

4. If the toolbox is not visible, on the View menu, click Toolbox. Then in the toolbox, click and drag the
Chart component to the space you have created above the table.

5. In the Select Chart Type dialog box, select the Stacked Column chart type (point the mouse over
the chart icons to display a tooltip with the name of the chart type). Then click OK.

6. Use the resize and selection handles to fit the chart into the space created when you moved the table.

Task 3: Specify the Data for the Chart


1. Click the chart area to select it, and then if necessary click inside the chart again to display the Chart
Data editing pane.

2. In the Chart Data editing pane, in the Values section, click the Add field button (a yellow plus
symbol) and click SalesAmount.

3. In the Category Groups section, click the drop-down button for the (Details) group and click
Month. Then in the Category Groups section, click the drop-down button for Month and click
Category Group Properties.

4. In the Category Group Properties dialog box, on the Sorting tab, in the Sort by drop-down list,
select MonthNo. Then click OK.
5. In the Chart Data editing pane, in the Series Groups section, click the Add field button and click
ProductCategory.

Task 4: Format the Chart


1. Click the chart area to select it, and then right-click inside it again and click Chart Properties.

2. In the Chart Properties dialog box, in the Color palette drop-down list, select a color palette for the
chart. Then click OK.

3. Right-click the Chart Title text box and click Delete Title.

4. Right-click the vertical axis title and clear the Show Axis Title checkmark.

5. Repeat the previous step to remove the horizontal axis title.

6. Right-click any of the numbers in the vertical axis and click Vertical Axis Properties.

7. In the Vertical Axis Properties dialog box, in the Number tab, select Currency. Then select Use
1000 separator and click OK.
8. Click the Preview tab and note that the first page of the report contains the chart. If a console
window opens, minimize it.

9. Click the Design tab to return to the report designer. Keep Visual Studio open for the next exercise.

Results: After this exercise, you should have a report that includes a chart.

Exercise 2: Adding Parameters to a Report


Task 1: Add a Parameter
1. In Visual Studio, with the Internet Sales report open in the designer, on the View menu, click Report
Data.

2. In the Report Data pane, expand the Datasets folder. Then right-click InternetSales and click
Dataset Properties.

3. In the Dataset Properties dialog box, under the Query text box, click Import and browse to
D:\Labfiles\Lab10\Starter. Then select Parameterized Sales Query.sql and click Open. The new
query is exactly the same as before except for the addition of a WHERE clause that includes a @Year
parameter.
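The imported query file itself is not reproduced in these lab steps. As a rough, hypothetical sketch (the actual Parameterized Sales Query.sql may use different table and column names in the AdventureWorksDW database), a query of this shape would return the fields the report uses and filter on the new @Year parameter:

```sql
-- Hypothetical sketch only; the actual Parameterized Sales Query.sql
-- may differ. The key point is the WHERE clause with the @Year parameter.
SELECT d.CalendarYear AS Year,
       d.EnglishMonthName AS Month,
       d.MonthNumberOfYear AS MonthNo,
       c.EnglishProductCategoryName AS ProductCategory,
       f.SalesAmount
FROM FactInternetSales AS f
JOIN DimDate AS d ON f.OrderDateKey = d.DateKey
JOIN DimProduct AS p ON f.ProductKey = p.ProductKey
JOIN DimProductSubcategory AS s ON p.ProductSubcategoryKey = s.ProductSubcategoryKey
JOIN DimProductCategory AS c ON s.ProductCategoryKey = c.ProductCategoryKey
WHERE d.CalendarYear = @Year;  -- the added filter on the report parameter
```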

4. In the Dataset Properties dialog box, click OK.


5. In the Report Data pane, expand Parameters and then right-click the Year parameter and click
Parameter Properties.

6. On the General tab, in the data type drop-down list, select Integer, and select Allow multiple
values. Then click OK.

7. In the Report Data pane, right-click the InternetSales dataset and click Dataset Properties. On the
Parameters tab, note that the @Year dataset parameter is mapped to the [@Year] report
parameter. Then click Cancel.

8. Right-click the report title (Internet Sales) and click Expression.

9. In the Expression dialog box, in the Set expression for: Value box, change the expression to the
following code and then click OK:

="Internet Sales for " & Join(Parameters!Year.Value, ", ")

10. Click the Preview tab. If a console window opens, minimize it.

11. In the Year drop-down list, type 2006. Then click View Report. The sales for 2006 are included in the
report.

12. In the Year drop-down list, after 2006 insert a new line and type 2007. Then click View Report. The
sales for 2006 and 2007 are included in the report.

Task 2: Configure Available and Default Parameter Values


1. Click the Design tab to view the Internet Sales report in design view.

2. In the Report Data pane, right-click the Datasets folder, and click Add Dataset.

3. In the Dataset Properties dialog box, change the Name to SalesYears, and select Use a dataset
embedded in my report.

4. In the Data source drop-down list, select AdventureWorksDW. Then under the Query text box,
click Import, browse to D:\Labfiles\Lab10\Starter, select Sales Years.sql, and click Open.

5. View the query, noting that it returns all years for which there have been Internet sales, and then in
the Dataset Properties dialog box, click OK.
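Sales Years.sql is not reproduced in these steps; a minimal sketch of such a query, assuming the standard AdventureWorksDW FactInternetSales and DimDate tables, might be:

```sql
-- Hypothetical sketch of Sales Years.sql: one row per calendar year
-- in which at least one Internet sales order was placed.
SELECT DISTINCT d.CalendarYear AS Year
FROM FactInternetSales AS f
JOIN DimDate AS d ON f.OrderDateKey = d.DateKey
ORDER BY Year;
```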

6. In the Report Data pane, right-click the Datasets folder, and click Add Dataset.
7. In the Dataset Properties dialog box, change the Name to MaxYear, and select Use a dataset
embedded in my report.

8. In the Data source drop-down list, select AdventureWorksDW. Then under the Query text box,
click Import, browse to D:\Labfiles\Lab10\Starter, select Max Year.sql, and click Open.

9. View the query, noting that it returns the year of the most recent Internet sales order, and then in the
Dataset Properties dialog box, click OK.
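Max Year.sql is not reproduced here either; a minimal sketch, assuming the same AdventureWorksDW tables, might be as simple as:

```sql
-- Hypothetical sketch of Max Year.sql: the most recent year
-- that contains an Internet sales order.
SELECT MAX(d.CalendarYear) AS MaxYear
FROM FactInternetSales AS f
JOIN DimDate AS d ON f.OrderDateKey = d.DateKey;
```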
10. In the Report Data pane, expand Parameters and then right-click the Year parameter and click
Parameter Properties.

11. In the Parameter Properties dialog box, in the Available Values tab, select Get values from a
query. Then in the Dataset drop-down list, select SalesYears, in the Value field drop-down list,
select Year, and then in the Label field drop-down list, select Year.

12. In the Parameter Properties dialog box, in the Default Values tab, select Get values from a query.
In the Dataset drop-down list, select MaxYear and in the Value field drop-down list, select
MaxYear. Then click OK.

13. Click the Preview tab, and note that the report is displayed using the default parameter value for the
most recent year (which is 2008).

14. Click the Year drop-down list and note that it contains a list of years for which there have been sales.

15. Click the Design tab to return to design view.

Task 3: View a Parameterized Report in SharePoint Server


1. In the Solution Explorer pane, right-click AWReports and click Properties.

2. In the AWReports Property Pages dialog box, verify that the following property values are set, and
then click OK:
o OverwriteDatasets: True

o OverwriteDataSources: True

o TargetDatasetFolder: http://mia-sql/sites/adventureworks/Reports/Datasets

o TargetDataSourceFolder: http://mia-sql/sites/adventureworks/Reports/Data Sources

o TargetReportFolder: http://mia-sql/sites/adventureworks/Reports

o TargetReportPartFolder: http://mia-sql/sites/adventureworks/Reports/Report Parts

o TargetServerURL: http://mia-sql/sites/adventureworks

o TargetServerVersion: SQL Server 2008 R2 or later

3. On the Build menu, click Deploy AWReports. When deployment has succeeded, minimize Visual
Studio.
4. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. The first time you
browse to this site, it can take a few minutes to open.

5. In the Quick Launch area on the left, click Reports.


6. In the Reports library, click Internet Sales and note that the report is displayed with the default
parameter value.

7. In the Parameters pane, in the Year drop-down list, clear the 2008 checkbox and select 2007. Then
click Apply. The report is updated to show sales for 2007.

8. On the Actions menu, point to Export and click Excel.

9. When prompted, in the Save drop-down list, click Save As, and save the report as Internet
Sales.xlsx in the D:\Labfiles\Lab10\Starter folder.

10. Wait for the file to be downloaded, then when the download is complete, click Open.

11. View the report in Excel, and then close it without saving any changes to the workbook.

12. Close Internet Explorer.

13. Keep Visual Studio open for the next exercise.

Results: At the end of this exercise, the sales report will include a parameter named Year, and two new
datasets to retrieve the available and default values for the parameter.

Exercise 3: Using Data Bars and Sparklines


Task 1: Add an Existing Report
1. In Visual Studio, in the AWReports solution, in Solution Explorer, right-click Reports, point to Add,
and click Existing item.

2. In the Add Existing Item - AWReports dialog box, browse to the D:\Labfiles\Lab10\Starter folder
and select Sales Trends.rdl. Then click Add.

3. In Solution Explorer, double-click Sales Trends.rdl to open it in Report Designer.

4. In the Report Data pane, expand Datasets and note that this report includes a dataset named Sales.

5. Right-click the Sales dataset and click Dataset Properties. Then in the Dataset Properties dialog
box, on the Parameters tab, note that this dataset includes a parameter named CalendarYear, and
click Cancel.

6. In the Report Data pane, expand Parameters and note that the Sales Trends report includes a report
parameter that is mapped to the CalendarYear dataset parameter.

Task 2: Configure Datasets and Parameters


1. In the Report Designer pane, click the Internet Sales.rdl tab to switch to the Internet Sales report.

2. With the Internet Sales report open, in the Report Data pane, in the Datasets folder, right-click
SalesYears and click Convert to Shared Dataset.

3. Repeat the previous step to convert the MaxYear dataset to a shared dataset.

4. In the Report Designer pane, click the Sales Trends.rdl tab to return to the Sales Trends report.
5. In the Report Data pane, right-click Datasets and click Add Dataset.

6. In the Dataset Properties dialog box, change the Name to SalesYears, select the SalesYears shared
dataset, and click OK.

7. Repeat the previous two steps to add a dataset named MaxYear based on the MaxYear shared
dataset to the Sales Trends report.

8. In the Report Data pane, under Parameters, right-click the CalendarYear parameter and click
Parameter Properties.

9. In the Parameter Properties dialog box, in the Available Values tab, select Get values from a
query. Then in the Dataset drop-down list, select SalesYears, in the Value field drop-down list,
select Year, and in the Label field drop-down list, select Year.

10. In the Parameter Properties dialog box, in the Default Values tab, select Get values from a query.
Then in the Dataset drop-down list, select MaxYear, and in the Value field drop-down list, select
MaxYear. Then click OK.

Task 3: Add a Data Bar


1. In the Sales Trends report, in the tablix data region, click the cell to the right of the cell containing the
[ProductCategory] field and above the cell containing the [EnglishMonthName] field. Type Sales
Volume: and then format the cell as italic.

2. In the toolbox, drag a Data Bar item and drop it in the cell to the right of the Sales Volume cell.
Then, in the Select Data Bar Type dialog box, select the first style in the Data Bar section (Bar) and
click OK.

3. Click the data bar you just added to display the Chart Data pane, and then in the Values section, add
the SalesAmount field.
4. In the Chart Data pane, in the Values section, click the drop-down arrow next to the SalesAmount
field and click Series Properties. Then, in the Series Properties dialog box, on the Fill tab, set the
following options and click OK:
o Fill style: Gradient

o Color: Light Steel Blue

o Secondary color: Cornflower Blue


o Gradient Style: Left right

5. In the Chart Data pane, in the Values section, click the drop-down arrow next to the SalesAmount
field and click Horizontal Axis Properties. Then, in the Horizontal Axis Properties dialog box,
verify that the Align axes in check box is checked and table1 is selected, and click OK.

6. Click the report design surface under the tablix to hide the Chart Data pane, and then click the cell
containing the data bar to select it and press F4. Then, in the Properties pane, expand Border Style
and set the Default property to None.

7. Click the Preview tab and verify that your report shows a data bar indicating the relative sales
volume for each product category. Then click the Design tab to return to the report designer.

Task 4: Add a Sparkline


1. In the tablix data region of the Sales Trends report, click the cell to the right of the data bar, type
Monthly Trend: and then format this cell as italic.

2. In the toolbox, drag a Sparkline item and drop it in the cell to the right of the Monthly Trend cell.
Then, in the Select Sparkline Type dialog box, select the second style in the Area section (Smooth
Area) and click OK.

3. Click the sparkline you just added to display the Chart Data pane, and then in the Values section,
add the SalesAmount field. Then, in the Category Groups section, click the drop-down arrow for the
Details group and select MonthNumberOfYear.

4. In the Chart Data pane, in the Values section, click the drop-down arrow next to the SalesAmount
field and click Series Properties. Then, in the Series Properties dialog box, on the Fill tab, set the
following options and click OK:

o Fill style: Solid

o Color: Cornflower Blue

5. In the Chart Data pane, in the Values section, click the drop-down arrow next to the SalesAmount
field and click Horizontal Axis Properties. Then, in the Horizontal Axis Properties dialog box,
check Align axes in, ensure table1 is selected, and click OK.

6. In the Chart Data pane, in the Values section, click the drop-down arrow next to the SalesAmount
field and click Vertical Axis Properties. Then, in the Vertical Axis Properties dialog box, check
Align axes in, ensure table1 is selected, and click OK.

7. Click the report design surface under the tablix to hide the Chart Data pane, and then click the cell
containing the sparkline to select it and press F4. Then, in the Properties pane, expand Border Style
and set the Default property to None.

8. Click the Preview tab and verify that your report shows a sparkline indicating the relative monthly
sales volume for each product category.

Task 5: Deploy Report Items


1. On the Build menu, click Deploy AWReports. When deployment has succeeded, minimize Visual
Studio.

2. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. The first time you
browse to this site, it can take a few minutes to open.

3. In the Quick Launch area on the left, click Reports.

4. In the Reports library, click Sales Trends and view the report.

5. Close Internet Explorer.

6. Keep Visual Studio open for the next exercise.

Results: After this exercise, you should have created a report that uses a data bar and a sparkline to show
a visual comparison of sales by product category.

Exercise 4: Using a Map


Task 1: Add an Existing Report
1. In Visual Studio, in the AWReports solution, in Solution Explorer, right-click Reports, point to Add,
and click Existing item.

2. In the Add Existing Item - AWReports dialog box, browse to the D:\Labfiles\Lab10\Starter folder
and select US Sales By State.rdl. Then click Add.

3. In Solution Explorer, double-click US Sales By State.rdl to open it in Report Designer.

4. In the Report Data pane, expand Datasets and note that this report includes a dataset named Sales.
5. Right-click the Sales dataset and click Dataset Properties. Then in the Dataset Properties dialog
box, on the Parameters tab, note that this dataset includes a parameter named CalendarYear, and
click Cancel.
6. In the Report Data pane, expand Parameters and note that the report includes a report parameter
that is mapped to the CalendarYear dataset parameter.

7. In the Report Data pane, right-click Datasets and click Add Dataset.
8. In the Dataset Properties dialog box, change the Name to SalesYears, select the SalesYears shared
dataset, and click OK.

9. Repeat the previous two steps to add a dataset named MaxYear based on the MaxYear shared
dataset to the report.

10. In the Report Data pane, under Parameters, right-click the CalendarYear parameter and click
Parameter Properties.

11. In the Parameter Properties dialog box, in the Available Values tab, select Get values from a
query. In the Dataset drop-down list, select SalesYears, in the Value field drop-down list, select
Year, and in the Label field drop-down list, select Year.
12. In the Parameter Properties dialog box, in the Default Values tab, select Get values from a query.
Then in the Dataset drop-down list, select MaxYear, and in the Value field drop-down list, select
MaxYear. Then click OK.
13. Click the Preview tab and note that this report shows sales for each US state for the selected year.
Then click the Design tab.

Task 2: Add a Map


1. In the toolbox, drag a Map item and drop it in the report body to the right of the existing tablix data
region in the US Sales By State report.

2. In the New Map Layer wizard, on the Choose a source of spatial data page, ensure Map gallery is
selected and in the Map Gallery pane, select USA by State Inset. Then click Next.

3. On the Choose spatial data and map view options page, review the default settings and click Next.

4. On the Choose map visualization page, select Color Analytical Map and click Next.

5. On the Choose the analytical dataset page, select Sales and click Next.

6. On the Specify the match fields for spatial and analytical data page, select the checkbox for the
STATENAME field, and in the Analytical Data Fields column, select StateProvinceName. Then click
Next.

7. On the Choose color theme and data visualization page, in the Theme drop-down list, select
Ocean, in the Field to visualize drop-down list, select [Sum(SalesTotal)], and in the Color rule
drop-down list, select White-Blue. Then click Finish.

Task 3: Format the Map


1. Click the map to display the Map Layers pane (you might need to scroll to the right). Then, in the
Map Layers pane, in the PolygonLayer1 drop-down list, click Polygon Properties.

2. In the Map Polygon Properties dialog box, on the General tab, in the Tooltip drop-down list, select
[Sum(SalesTotal)]. Then click OK.

3. In the Map Layers pane, in the PolygonLayer1 drop-down list, click Polygon Color Rule.

4. In the Map Color Rules Properties dialog box, on the General tab, note that the data is visualized
by using color ranges from white to blue. Then, on the Distribution tab, set Number of subranges
to 10 and Range start to 0, and then click OK.

5. Right-click the Title text box above the legend, and click Legend Title Properties. Then in the Map
Legend Title Properties dialog box, change the title text to Sales ($) and click OK.

6. Right-click the Map Title text box and click Title Properties. Then in the Map Title Properties
dialog box, change the title text to Sales by State and click OK.

7. Right-click the color scale at the bottom left of the map and clear the Show Color Scale option.

8. Click the Preview tab and view the report, verifying that the map indicates sales volume in each state
by the shade of the color used to fill the state.

9. Point the mouse to any state that has a blue fill color and verify that the tooltip displayed shows the
sales figure for that state.

Task 4: Deploy Report Items


1. On the Build menu, click Deploy AWReports. When deployment has succeeded, close Visual Studio.

2. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. The first time you
browse to this site, it can take a few minutes to open.
3. In the Quick Launch area on the left, click Reports.

4. In the Reports library, click US Sales By State and view the report.

5. Close Internet Explorer.

Results: After this exercise, you should have created a report that shows sales by US state on a map.

Module 11: Managing Report Execution and Delivery
Lab: Configuring Report Execution and Delivery
Exercise 1: Configuring Report Execution
Task 1: Prepare the Lab Environment
1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to
20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab11\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
4. Use Visual Studio to open the AWReports.sln solution in the D:\Labfiles\Lab11\Starter folder. Then
on the Build menu, click Deploy Solution. When deployment is complete, close Visual Studio.

Task 2: Configure a Shared Data Source


1. Start Internet Explorer and browse to the SharePoint site at http://mia-sql/sites/adventureworks. Then, in
the Quick Launch area, click Reports and click Data Sources.

2. Click the ellipsis (…) for the AdventureWorksDW data source, then in the AdventureWorksDW.rds
information panel, click the ellipsis (…) and click View Dependent Items.

3. Note that the Internet Sales report has a dependency on this data source. Then click Close.

4. Click the ellipsis (…) for the AdventureWorksDW data source, then in the AdventureWorksDW.rds
information panel, click the ellipsis (…) and click Edit Data Source Definition.
5. In the Credentials section of the configuration page, note that the data source is currently
configured to use the Windows authentication (Integrated) or SharePoint user option.

6. In the Credentials section, select the Stored credentials option and enter the following credentials:
o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

7. Select Use as Windows credentials and click Test Connection. Then when the connection has been
tested successfully, click OK.

Task 3: Configure Report Caching


1. In the SharePoint site in Internet Explorer, click the Reports document library.

2. Click the ellipsis (…) for the Internet Sales report, then in the Internet Sales.rdl information panel,
click the ellipsis (…) and click Manage Processing Options.

3. In the Data Refresh Options section, select the Use cached data option, and then in the Cache
Options section, select On a custom schedule and click Configure.
4. In the Frequency section, select Day, in the Schedule section select all the days and set the Start
time to 12:00 am, and click OK. Then click OK again to set the processing options and return to the
Reports folder.
5. Click Internet Sales and note the execution date and time under the report heading.

6. At the top of the report page, click the Reports link to return to the Reports folder, and then click
Internet Sales again. Note that the execution date and time have not changed because the report
has been cached.

7. Keep Internet Explorer open for the next exercise.

Results: After this exercise, you should have configured a shared data source to use stored credentials,
and configured a report to display a cached instance.

Exercise 2: Implementing a Standard Subscription


Task 1: Subscribe to a Report
1. With the Internet Sales report displayed in the browser, on the Actions menu, click Subscribe.

2. In the Delivery Extension drop-down list, ensure E-Mail is selected.

3. In the Delivery Options section, enter the following options:


o To: student@adventureworks.msft.

o Comment: The sales report is attached.

4. In the Report Contents section, ensure that Show report inside message is selected and in the
Format drop-down list, select Excel.

5. In the Delivery Event section, ensure that On a custom schedule is selected and click Configure.
Then define a custom schedule that will send the report every day in two minutes from the current
time and click OK. You can determine the current system time by starting a command prompt
window and entering the command time /T.

6. In the Parameters section, ensure that Use Report Default Value is selected. Then click OK.

Task 2: Verify the Subscription


1. At the top of the report page, click the Reports link to return to the Reports folder.

2. Click the ellipsis (…) for the Internet Sales report, then in the Internet Sales.rdl information panel,
click the ellipsis (…) and click Manage Subscriptions.
3. Wait two minutes and then refresh the page. The Last Results column should indicate that the
subscription has run and the report was sent as an email message. Then minimize Internet Explorer.

4. View the contents of the C:\inetpub\mailroot\Drop folder and double-click the email message that
has been received by the local SMTP server to open it in Microsoft Outlook.

5. Read the email message and open the attached Excel file to view the report. Then close Excel and the
email message.

Results: After this exercise, you should have created a standard subscription that delivers a report by
email.

Exercise 3: Implementing a Data-Driven Subscription


Task 1: Create a Table of Subscription Data
1. In the D:\Labfiles\Lab11\Starter folder, double-click Subscription Table.sql to open it in SQL Server
Management Studio. Each time you are prompted, use Windows authentication to connect to the
database engine on the localhost server.

2. View the Transact-SQL code and note that it creates and populates a table named
ReportSubscriptions, which contains the following columns:

o SubscriptionID: a unique primary key.

o RecipientEmail: the email address of a subscription recipient.

o ReportFormat: the format in which the report should be rendered.

o Linked: a Boolean value that indicates whether the subscription email message should include a
link to the report on the report server.
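Subscription Table.sql is not reproduced in these steps. A hypothetical sketch of the schema it creates, with an illustrative row (the actual script populates the table with the three subscriptions processed later in the lab), might look like:

```sql
-- Hypothetical sketch only; column types and sample data are assumptions.
CREATE TABLE dbo.ReportSubscriptions
(
    SubscriptionID int IDENTITY(1,1) PRIMARY KEY,  -- unique primary key
    RecipientEmail nvarchar(256) NOT NULL,         -- subscription recipient
    ReportFormat nvarchar(50) NOT NULL,            -- e.g. 'Excel', 'PDF'
    Linked bit NOT NULL                            -- include a report link?
);

INSERT INTO dbo.ReportSubscriptions (RecipientEmail, ReportFormat, Linked)
VALUES ('student@adventureworks.msft', 'Excel', 1);
```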

3. Click Execute to run the query. Then when it has completed, close SQL Server Management Studio.

Task 2: Create a Data-Driven Subscription


1. Maximize Internet Explorer, which should be open at the Manage Subscriptions page for the
Internet Sales report.

2. Click Add Data-Driven Subscription.

3. In the Description text box, type Weekly Sales Report.


4. In the Connection Type section, select Shared data source.

5. In the Data Source Link section, click the ellipsis (…) button, and then in the Select an Item dialog
box, click Data Sources, select the AdventureWorksDW data source and click OK.
6. In the Query section, type the following query and click Validate. When the query is validated
successfully, click Next:

SELECT * FROM ReportSubscriptions

7. In the Year section, ensure that Use report default value is selected, and click Next.
8. In the Delivery Type section, ensure that E-Mail is selected. Then set the following configuration
values and click Next.

o To: Select a value from the database (select RecipientEmail).

o Include Report: True.

o Render Format: Select a value from the database (select ReportFormat).

o Subject: Specify a static value (enter Weekly sales report).

o Comment: Specify a static value (enter The weekly sales report is attached).

o Include Link: Select a value from the database (select Linked).

9. In the Delivery Event section, ensure that On a custom schedule is selected.

10. In the Frequency section, select Day.



11. In the Schedule section, select the current day and enter a time that is two minutes later than the
current time. You can determine the current system time by starting a command prompt window and
entering the command time /T. You can also use the command echo %date% to determine the
current day and date.

12. Click Finish and view the subscription details.

Task 3: Verify the Subscription


1. Wait for two minutes and then refresh the page. When the subscription has been processed, the Last
Results column should contain the message Done: 3 processed of 3 total; 0 errors.

2. View the contents of the C:\inetpub\mailroot\Drop folder and note the email messages that have
been received by the local SMTP server.

3. Double-click each of the email messages to view them in Microsoft Outlook.

4. Close the email messages and folder windows. Then close Internet Explorer.

Results: After this exercise, you should have created a data-driven subscription that delivers a report to
multiple recipients, in multiple formats by email.

Module 12: Delivering BI with SharePoint PerformancePoint Services
Lab: Implementing a SharePoint Server BI Solution
Exercise 1: Creating a SharePoint Server Site for BI
Task 1: Prepare the Lab Environment
1. Ensure that the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines are both running, and then
log on to 20466C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab12\Starter folder, right-click Setup.cmd, and then click Run as administrator.

3. When prompted to confirm that you want to run the command file, click Yes and then wait for the
script to finish. If the script prompts you to confirm the deletion of a SharePoint site, press Enter. The
script may take a few minutes to complete.

Task 2: Enable SharePoint Publishing


1. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. The first time you do this,
it may take a few minutes to open.

2. In the title bar for the home page, next to Student, click the Settings icon and in the menu, click Site
settings.

3. On the Site Settings page, under Site Collection Administration, click Site collection features.

4. On the Site Collection Features page, in the SharePoint Server Publishing Infrastructure row, if
the feature is not already active, click Activate, and then wait for the Active indicator to appear.
Note: The feature can take a few minutes to activate.

5. At the top of the Site Collection Features page, click Site Settings to return to the Site Settings
page.
6. Under Site Actions, click Manage site features.

7. On the Site Features page, in the SharePoint Server Publishing row, if the feature is not already
active, click Activate, and then wait for the Active indicator to appear.
8. At the top of the Site Features page, click Adventure Works Portal to return to the home page.

Task 3: Create a Subsite


1. On the Adventure Works Portal home page, in the Quick Launch pane on the left, click Site
Contents.

2. At the bottom of the Site Contents page, click new subsite.

3. On the New SharePoint Site page, under Title, in the text box, type Adventure Works BI Portal.

4. Under Description, in the text box, type A subsite for BI.

5. In the URL name text box, type bi, so that the URL for the new subsite is http://mia-sql/sites/adventureworks/bi.

6. In the Template Selection area, under Select a template, on the Enterprise tab, click Business
Intelligence Center.

7. At the bottom of the page, click the Create button. After a short time, the Adventure Works BI Portal
site is displayed.

8. Select the URL in the Internet Explorer navigation bar, right-click it, and then click Copy.

9. In the Quick Launch area, click Home.

10. On the home page, under the Quick Launch area, click Edit Links, and then click link.
11. In the Add a link dialog box, in the Text to display box, type BI Portal, right-click the Address box,
click Paste, and then click OK.

12. Under link, click Save.

13. In the Quick Launch area, click the new BI Portal link and verify that the Adventure Works BI Portal
site is displayed.

14. In the Adventure Works BI Portal, in the title bar for the home page, next to Student, click the
Settings icon, and then click Site settings.

15. On the Adventure Works BI Portal Site Settings page, under Look and Feel, click Navigation.

16. On the Navigation Settings page, in the Current Navigation section, select Structural Navigation:
Display only the navigation items below the current site. At the top of the page, click OK. Note
that the Quick Launch area now only shows links for the items in the BI Portal subsite, and not for
items in the parent site.
17. Click the image above the Quick Launch area. This provides a navigation link to the home page of the
subsite.

18. Close Internet Explorer.

Results: At the end of this exercise, you should have created a subsite based on the Business Intelligence
Center template at http://mia-sql/sites/adventureworks/bi.

Exercise 2: Configuring PerformancePoint Data Access


Task 1: Configure the PerformancePoint Unattended Account
1. On the Start screen, type SharePoint and start the SharePoint 2013 Central Administration app.
When prompted to allow the program to make changes, click Yes.

2. In SharePoint Central Administration, under Application Management, click Manage service
applications.

3. In the list of service applications, click PerformancePoint Services Application. Make sure you click
the link for the application, and not the link for its proxy.

4. On the Manage PerformancePoint Services page, click PerformancePoint Service Application
Settings.

5. On the PerformancePoint Service Application Settings page, ensure that Unattended Service
Account is selected, and if a Change User button is present, click it.

6. Enter the following user credentials, and then click OK:


o User Name: ADVENTUREWORKS\ServiceAcct

o Password: Pa$$w0rd

7. Close the SharePoint Central Administration application.



Task 2: Create a Data Source


1. Start Internet Explorer and browse to http://mia-sql/sites/adventureworks. In the Quick Launch area,
click BI Portal.

2. In the Quick Launch area, click Data Connections.

3. On the Ribbon, on the PerformancePoint tab, click Dashboard Designer.

4. In the Internet Explorer prompt to open the file, click Open. If the Application Run Security
Warning dialog box is displayed, click Run.

Note: The Dashboard Designer can take a few minutes to open.

5. In the Dashboard Designer, in the Workspace Browser pane, right-click Data Connections, and then
click New Data Source.

6. In the Select a Data Source Template dialog box, under Template, click Analysis Services, and
then click OK.
7. When the new data source is created, rename it to Adventure Works OLAP.

8. Under Connection Settings, in the Server text box, type MIA-SQL, in the Database drop-down list,
click Adventure Works OLAP, and in the Cube drop-down list, click Sales.
9. On the Adventure Works OLAP page, click the Time tab.

10. In the Time Dimension drop-down list, click Order Date.Order Date.Calendar Date.

11. In the Choose a date to begin the year box for the selected time dimension, click Browse.

12. In the Select Members dialog box, select the 1st of January 2008, and then click OK.

13. In the Hierarchy level list, click Day.

14. In the Enter a date that is equal to the period specified by the reference member above list,
select January 1st of the current year.

15. In the Time Member Associations pane, create the following mappings:

o Calendar Year: Year

o Calendar Semester: Semester

o Calendar Quarter: Quarter

o Month: Month

o Day: Day

16. In the Workspace Browser pane, right-click Adventure Works OLAP, and then click Save.

Task 3: Create a Filter


1. In Dashboard Designer, in the Workspace Browser pane, click PerformancePoint Content. On the
ribbon, on the Create tab, click Filter.

2. In the Select a Filter Template dialog box, select Time Intelligence, and click OK.

3. On the Select a data source page, click Add Data Source, select the Adventure Works OLAP data
source and click OK, and then click Next.

4. On the Enter Time Formula page, in the Formula column, type Year, and in the Display Name
column, type Current Year.

5. In the new empty row under the values you entered, in the Formula column, type Year-1, and in the
Display Name column, type Last Year.

6. Click Preview, and note the dimension members from the Adventure Works OLAP data source that
are mapped to these functions. Then click Close, and on the Enter Time Formula page, click Next.

7. On the Select Display method page, select List, and click Finish.

8. Rename the new filter to Sales Year.
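The Year and Year-1 expressions are Time Intelligence formulas that PerformancePoint resolves relative to the current-date mapping configured on the data source's Time tab. The following Python sketch illustrates only the underlying period arithmetic; it is not PerformancePoint's implementation, and the function name is invented for this example:

```python
import datetime

def resolve_time_formula(formula, today=None):
    """Resolve a simple Time Intelligence formula such as 'Year' or
    'Year-1' to a calendar year, relative to a reference date.
    Illustrative sketch only; handles only year-level formulas."""
    today = today or datetime.date.today()
    offset = 0
    if "-" in formula:
        offset = -int(formula.split("-")[1])
    elif "+" in formula:
        offset = int(formula.split("+")[1])
    return today.year + offset

print(resolve_time_formula("Year", datetime.date(2014, 6, 1)))    # → 2014
print(resolve_time_formula("Year-1", datetime.date(2014, 6, 1)))  # → 2013
```

Selecting Last Year in the dashboard therefore maps the filter to the calendar year preceding the one containing the reference date.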

Results: At the end of this exercise, you will have configured the credentials used by PerformancePoint
Services, created a PerformancePoint data source for the Adventure Works OLAP database, and created a
filter that will enable users to display BI data for the current year or the previous year.

Exercise 3: Creating PerformancePoint Reports


Task 1: Create an Analytic Chart Report
1. In Dashboard Designer, on the ribbon, on the Create tab, click Analytic Chart.

2. In the Create an Analytic Chart Report dialog box, on the Workspace tab, click Adventure Works
OLAP, and then click Finish.

3. When the chart is created, rename it to Reseller Profit.

4. In the Details pane on the right of the screen, expand Measures.

5. Drag the Reseller Profit measure to the Bottom Axis area.

6. In the Details pane, expand Dimensions.

7. Drag the Sales Territory dimension to the Series area.

8. In the Series area, click the Sales Territory drop-down arrow. Then, in the Select Members dialog
box, select only Europe, North America, and Pacific, and click OK.

9. In the Details pane, under Dimensions, expand Order Date.


10. Drag the Calendar Date dimension hierarchy to the Background area.

11. On the ribbon, on the Edit tab, in the Legend drop-down list, click Show Legend at Top.

12. In the Workspace Browser pane, right-click Reseller Profit, and then click Save.

Task 2: Create a Reporting Services Report


1. In Dashboard Designer, on the ribbon, on the Create tab, click Reporting Services.

2. Rename the new report to Reseller Revenue.

3. In the Reseller Revenue pane, in the Server mode drop-down list, select SharePoint Integrated.
4. In the SharePoint Site box, type http://mia-sql/sites/adventureworks.

5. In the Document Library drop-down list, select Reports.

6. In the Report drop-down list, select Reseller Revenue.rdl.

7. Clear the Show toolbar check box, and ensure that Show parameters is not selected. In the Report
parameters table, note that the Year parameter has been set to use its default value.

8. In the Workspace Browser pane, right-click Reseller Revenue, and then click Save.

Results: At the end of this exercise, you will have created an analytic chart named Reseller Profit, and a
PerformancePoint report, based on SQL Server Reporting Services, named Reseller Revenue.

Exercise 4: Creating a PerformancePoint Scorecard


Task 1: Create a Scorecard
1. In Dashboard Designer, in the Workspace Browser pane, click PerformancePoint Content. On the
ribbon, on the Create tab, click Scorecard.

2. In the Select a Scorecard Template dialog box, select Analysis Services, and click OK.

If the Select a Scorecard Template dialog box does not appear, delete the new scorecard that is
added, click the Office button and click Designer Options, and select the Use wizards to create
scorecards check box. Then click Save and repeat this step.

3. On the Select a data source page, select the Adventure Works OLAP data source and click Next.

4. On the Select a KPI Source page, select Import SQL Server Analysis Services KPIs, and click Next.
5. On the Select KPIs to Import page, select the Reseller Margin KPI, and click Next.

6. On the Add Measure Filters page, click Next.

7. On the Add Member Columns page, click Next.

8. On the Locations page, click Finish.

9. When the scorecard is added, rename it to Reseller Scorecard.

Task 2: Edit a KPI


1. In the Workspace Browser pane, select the Reseller Margin KPI.
2. In the Value row, in the Number Format column, click (Default). Then in the Format Numbers
dialog box, in the Format drop-down list, select Percentage, and click OK.

3. Repeat the previous step for the Goal and Status, and Trend rows.

Task 3: Edit a Scorecard


1. In the Workspace Browser pane, select Reseller Scorecard.

2. On the ribbon, on the Edit tab, click Update. The scorecard is updated to reflect the changes you
made to the KPI.
3. In the Details pane, expand Dimensions, and then drag Sales Territory to the right edge of the
Reseller Margin cell.

4. In the Select Members dialog box, select Europe, North America, and Pacific, and then click OK.

5. On the ribbon, on the Edit tab, click Update. The scorecard is updated to show the KPI indicator for
each sales region group.

6. Expand Europe and note that the KPI is shown for each level of the hierarchy. The Trend value will
not change, because it compares the current time period with the previous time period, and no
specific time period has been specified for the scorecard.

7. Collapse Europe.

8. Click the Goal and Status column header, and then on the Edit tab of the ribbon, click Metric.

9. In the Target Settings dialog box, in the Data Value list, select No Value, and in the Additional
data value list, select Score. Then click OK. The scorecard is updated to show only the score for each
KPI icon, representing the relative percentage of the KPI value to the target.

10. Click the Trend column header, and then on the Edit tab of the ribbon, click Hide. This column will
not be displayed in SharePoint Server.

11. On the Edit tab of the ribbon, click Settings.

12. In the View Settings dialog box, on the Toolbar tab, select Show scorecard toolbar. Then click OK.
The toolbar is not shown in the designer, but will be visible when the scorecard is published to
SharePoint Server.

Results: At the end of this exercise, you will have added a KPI named Reseller Margin and a scorecard
named Reseller Scorecard to the Dashboard Designer.

Exercise 5: Creating a PerformancePoint Dashboard


Task 1: Create a Dashboard
1. In Dashboard Designer, on the ribbon, on the Create tab, click Dashboard.
2. In the Select a Dashboard Page Template dialog box, click 2 Columns, and then click OK.

3. When the new dashboard is created, rename it to Reseller Dashboard.

4. In the Pages pane, select Page 1 and rename it to Reseller Performance.


5. In the Details pane, expand Filters, and then expand PerformancePoint Content.

6. Drag the Sales Year filter to the Left Column area of the dashboard.

7. In the Details pane, expand Reports, and then expand PerformancePoint Content.

8. Drag the Reseller Profit report to the Left Column area of the dashboard, directly under the Sales
Year filter.

9. Drag the Reseller Revenue report to the Right Column area of the dashboard.
10. In the Details pane, expand Scorecards, and then expand PerformancePoint Content.

11. Drag Reseller Scorecard to the Left Column area of the dashboard, directly under the Reseller
Profit report.

Task 2: Create Connections


1. In the Left Column area, in the Reseller Profit drop-down list, click Create Connection.

2. In the Connection dialog box, in the Get value from list, select Left Column (1) Sales Year, and
on the Values tab, in the Connect to list, select Order Date.Calendar Date, and in the Source value
list, select Member Unique Name. Then click OK.

3. In the Left Column area, in the Reseller Scorecard drop-down list, click Create Connection.

4. In the Connection dialog box, in the Get value from list, select Left Column (1) Sales Year, and
on the Values tab, in the Connect to list, select TI Formula, and in the Source value list, select
Formula. Then click OK.

5. In the Right Column area, in the Reseller Revenue drop-down list, click Create Connection.
6. In the Connection dialog box, in the Get value from list, select Left Column (1) Sales Year, and
on the Values tab, in the Connect to list, select Year, and in the Source value list, select Display
Value. Then click OK.

Task 3: Deploy a Dashboard


1. In the Workspace Browser pane, right-click Untitled Workspace, and then click Save.

2. Save the workspace as Reseller Workspace in the D:\Labfiles\Lab12\Starter folder.

3. In the Workspace Browser pane, right-click Reseller Dashboard, and then click Deploy to
SharePoint.

4. In the Deploy To dialog box, expand Adventure Works BI Portal, click Dashboards, and then click
OK. The dashboard is uploaded to SharePoint Server and opened in a new tab in Internet Explorer.

5. In Internet Explorer, on the ribbon, on the Page tab, click Make Homepage. When prompted to
confirm the action, click OK.

6. Close the Internet Explorer tab that contains the dashboard, and in the remaining tab (which should
be displaying the Data Connections library), click the Browse tab, and then click the image above
the Quick Launch area to go to the site's home page (which is now the dashboard you created).

7. View the dashboard, noting the information it contains. Note that at smaller screen resolutions, you
may need to scroll within each dashboard element to see all the data.

Task 4: Browse a Dashboard


1. In the dashboard page, in the Sales Year drop-down list, select Last Year. Note that the Reseller
Profit, Reseller Scorecard, and Reseller Revenue items are updated to reflect results from the
previous year (which is calculated to be 2007). The Reseller Revenue report might take longer to
refresh.

2. In the Reseller Profit report, hold the mouse over the central column (which represents North
America) and view the details in the tooltip. Then click the central column and note that the chart
updates to show details for North America.

3. Click the column for United States, and then view the profit for the regions in that territory.
4. Click the drop-down arrow at the upper-right of the chart, and then click Reset View to return to the
default chart view for all sales territories.

5. In the Reseller Scorecard area, expand the sales regions to view the sales performance in each
territory hierarchy.

6. In the Reseller Revenue report, expand the sales regions to view monthly revenue in those regions.

7. Close Internet Explorer and close the Dashboard Designer.

Results: At the end of this lab, you will have created a dashboard named Reseller Dashboard and
published it to SharePoint Server.

Module 13: Performing Predictive Analysis with Data Mining


Lab: Using Data Mining to Support a
Marketing Campaign
Exercise 1: Using Table Analysis Tools
Task 1: Prepare the Lab Environment
1. Start the 20466C-MIA-DC and 20466C-MIA-SQL virtual machines, and then log on to 20466C-MIA-
SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.

2. In the D:\Labfiles\Lab13\Starter folder, right-click Setup.cmd and click Run as administrator.

3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.

Task 2: Enable the Data Mining Add-Ins in Excel


1. Start Excel and create a new blank workbook.

2. On the ribbon, click File. Then click Options.

3. In the Excel Options dialog box, on the Add-Ins page, in the Manage list, select COM Add-Ins and
click Go.

4. In the COM Add-Ins dialog box, if SQLServer.DMClientXLAddin and SQLServer.DMXLAddin are


not selected, select them both. Then click OK.
5. Close Excel.

Task 3: Perform Table Analysis


1. Start Excel and open the Customer Data For Data Mining.xlsx workbook in the
D:\Labfiles\Lab13\Starter folder.

2. Select any cell in the table of data, and then on the ribbon, in the Table Tools group, click the
Analyze tab.

3. In the Connection area, click the connection icon (which is labeled with the name of the last
connection used or <No Connection>). Then in the Analysis Services Connections dialog box, click
New.

4. In the Connect to Analysis Services dialog box, in the Server name field, type MIA-SQL, in the
Catalog name drop-down list, click DMAddinsDB, and then click OK.

5. In the Analysis Services Connections dialog box, click Close.

6. On the ribbon, click Analyze Key Influencers.


7. In the SQL Server Data Mining - Analyze Key Influencers dialog box, in the Column Selection
drop-down list, click Purchased Bike, and then click Run.

8. In the SQL Server Data Mining Discrimination based on key influencers dialog box, in the
Compare Value 1 drop-down list, click Yes, and then in the to Value 2 drop-down list, click No.

9. Click Add Report, and then click Close.

10. Review the Key Influencers Report for Purchased Bike, and note the values that most strongly
correlate with a customer purchasing a bike.

11. Save the workbook and close Excel.



Results: After this exercise, you should have created a Key Influencers report in Excel.

Exercise 2: Creating a Data Mining Structure


Task 1: Create a Data Mining Project
1. Start Visual Studio, and on the File menu, point to New, and click Project.

2. In the New Project dialog box, click Analysis Services Multidimensional and Data Mining Project,
in the Name field, type AW Data Mining, in the Location field, browse to the
D:\Labfiles\Lab13\Starter folder and click Select Folder, and then click OK.

3. In Solution Explorer, right-click Data Sources, and then click New Data Source.

4. In the Data Source Wizard, on the Welcome to the Data Source Wizard page, click Next.

5. On the Select how to define the connection page, click New. Then, in the Connection Manager
dialog box, in the Server name field, type MIA-SQL, in the Select or enter a database name drop-
down list, click AdventureWorksDW, and click OK.

6. In the Data Source Wizard, on the Select how to define the connection page, click Next.

7. On the Impersonation Information page, click Use a specific Windows user name and password,
in the User name field, type ADVENTUREWORKS\ServiceAcct, in the Password field, type
Pa$$w0rd, and then click Next.

8. On the Completing the Wizard page, click Finish.


9. In Solution Explorer, right-click Data Source Views, and then click New Data Source View.

10. In the Data Source View Wizard, on the Welcome to the Data Source View Wizard page, click
Next.
11. On the Select a Data Source page, ensure that Adventure Works DW is selected, and then click
Next.

12. On the Select Tables and Views page, in the Available objects list, click ProspectiveBuyer (dbo),
hold down the Ctrl key, click vTargetMail (dbo), and then click the > button to move the selected
objects to the Included objects list. Then click Next.

13. On the Completing the Wizard page, in the Name field, type Adventure Works DW DM View,
and then click Finish.

Task 2: Create a Data Mining Structure and a Data Mining Model


1. In Solution Explorer, right-click Mining Structures, and then click New Mining Structure.

2. In the Data Mining Wizard, on the Welcome to the Data Mining Wizard page, click Next.
3. On the Select the Definition Method page, ensure that From existing relational database or data
warehouse is selected, and then click Next.

4. On the Create the Data Mining Structure page, ensure that Create mining structure with mining
model is selected and select the Microsoft Decision Trees data mining technique. Then click Next.

5. On the Select a Data Source View page, select Adventure Works DW DM View and click Next.

6. On the Specify Table Types page, in the vTargetMail row, select the check box in the Case column,
and then click Next.

7. On the Specify the Training Data page, in the Mining model structure table, in the BikeBuyer
row, select the check box in the Predictable column. In the CustomerKey row, select the check box
in the Key column. For all other rows, select the Input check box, and then click Next.

8. On the Specify Columns Content and Data Type page, click Detect. Ensure that the content type
of the Bike Buyer column is identified as Discrete, and change the Yearly Income column to
Discrete. Then click Next.

9. On the Create Testing Set page, note that the Percentage of data for testing value is 30 percent,
and then click Next.
10. On the Completing the Wizard page, in the Mining structure name field, type Purchase
Prediction. In the Mining model name field, type Purchase Decision Tree, and then click Finish.

11. On the menu bar, click Build, and then click Deploy AW Data Mining.

12. When deployment is complete, close Visual Studio.
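Behind the wizard, deploying the project creates the equivalent of DMX definition statements on the Analysis Services instance. The following is a rough, abbreviated sketch using the names from the steps above; the full column list is omitted and the exact generated syntax will differ:

```sql
CREATE MINING STRUCTURE [Purchase Prediction] (
  [Customer Key]  LONG KEY,
  [Bike Buyer]    LONG DISCRETE,
  [Yearly Income] DOUBLE DISCRETE
  -- ...remaining input columns omitted...
) WITH HOLDOUT (30 PERCENT);

ALTER MINING STRUCTURE [Purchase Prediction]
ADD MINING MODEL [Purchase Decision Tree] (
  [Customer Key],
  [Bike Buyer] PREDICT,
  [Yearly Income]
  -- ...remaining input columns omitted...
) USING Microsoft_Decision_Trees;
```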

Results: After this exercise, you should have created a data mining structure and a data mining model.

Exercise 3: Adding a Data Mining Model to a Data Mining Structure


Task 1: Add a Model to the Data Mining Structure
1. Start Excel and create a new blank workbook.

2. On the ribbon, on the Data Mining tab, in the Connection area (or drop-down list, depending on
the screen resolution), click the connection icon (which is labeled with the name of the last
connection used or <No Connection>), and then in the Analysis Services Connections dialog box,
click New.
3. In the Connect to Analysis Services dialog box, in the Server name field, type MIA-SQL, in the
Catalog name drop-down list, click AW Data Mining, and then click OK.

4. In the Analysis Services Connections dialog box, click Close.

5. On the ribbon, in the Data Modeling area, click Advanced, and then click Add Model to Structure.

6. In the Add Model to Structure Wizard, on the Getting Started with the Add Model to Structure
Wizard page, click Next.
7. On the Select Structure or Model page, ensure that the Purchase Prediction structure is selected,
and then click Next.

8. On the Select Mining Algorithm page, in the Algorithm drop-down list, click Microsoft Naive
Bayes, and then click Next.

9. On the Select Columns page, change the Usage value for the following columns and click Next:

o Bike Buyer: Predict Only

o Name Style: Do not use

10. On the Finish page, click Finish.
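The Microsoft Naive Bayes algorithm scores each predictable value by combining its prior probability with per-attribute likelihoods learned from the training cases. The following toy Python sketch illustrates that idea only; it is not the SSAS implementation (which adds its own feature selection and scoring), and the data and names here are invented:

```python
from collections import Counter, defaultdict

def train(rows, labels):
    """Count class priors and per-attribute value counts from discrete data."""
    priors = Counter(labels)
    likelihoods = defaultdict(Counter)  # maps (attr_index, label) -> Counter of values
    for row, label in zip(rows, labels):
        for i, value in enumerate(row):
            likelihoods[(i, label)][value] += 1
    return priors, likelihoods

def predict(priors, likelihoods, row):
    """Pick the label maximizing P(label) * product of P(value | label)."""
    total = sum(priors.values())
    best_label, best_score = None, -1.0
    for label, count in priors.items():
        score = count / total
        for i, value in enumerate(row):
            # Laplace smoothing so unseen values do not zero out the score
            score *= (likelihoods[(i, label)][value] + 1) / (count + 2)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy example: predict bike purchase (1/0) from (commute_distance, has_children)
rows = [("short", "no"), ("short", "no"), ("long", "yes"), ("long", "yes")]
labels = [1, 1, 0, 0]
priors, likelihoods = train(rows, labels)
print(predict(priors, likelihoods, ("short", "no")))  # → 1
```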

Task 2: Review the Data Mining Model


1. In the Browse dialog box, on the Dependency Network tab, review the diagram showing the factors
that correlate with Bike Buyer, and then under All Links, move the slider down one notch to remove
the weakest correlations from the diagram.

2. Repeat step 1 until the diagram shows only the strongest correlations.

3. Click the Attribute Profiles tab, and review the attribute profile graphics.

4. Click the Attribute Characteristics tab, in the Value drop-down list click 1, and then review the
information in the Characteristics for 1 table.

5. Click the Attribute Discrimination tab, in the Value 1 drop-down list, click 1, in the Value 2 drop-
down list, click 0, and then review the information in the Discrimination scores for 1 and 0 table.
Maximize the window if necessary.

6. In the Browse dialog box, click Close.

Results: After this exercise, you should have created a Naive Bayes data mining model.

Exercise 4: Validating Data Mining Models


Task 1: Create an Accuracy Chart
1. In Excel, on the ribbon, on the Data Mining tab, in the Accuracy and Validation area, click
Accuracy Chart.

2. In the Accuracy Chart Wizard, on the Getting Started with the Accuracy Chart Wizard page, click
Next.

3. On the Select Structure or Model page, select Purchase Prediction and click Next.

4. On the Specify Column to Predict and Value to Predict page, in the Mining column to predict
drop-down list, ensure that Bike Buyer is selected. In the Value to predict drop-down list, click 1,
and then click Next.

5. On the Select Source Data page, ensure that Test data from mining structure is selected, and then
click Finish.
6. In the Accuracy Chart for Structure Purchase Prediction chart, review the data for Purchase
Bayes, Purchase Decision Tree, Ideal Model, and No Model.

Note: If the chart does not show all the data, select it and, on the Design tab of the ribbon, click Select
Data. Then in the Select Data Source dialog box, click OK.

Task 2: Create a Classification Matrix


1. On the ribbon, on the Data Mining tab, in the Accuracy and Validation area, click Classification
Matrix.

2. In the Classification Matrix Wizard, on the Getting Started with the Classification Matrix Wizard
page, click Next.

3. On the Select Structure or Model page, select Purchase Prediction and click Next.

4. On the Specify Column to Predict page, ensure that Bike Buyer is selected and click Next.

5. On the Select Source Data page, ensure that Test data from mining structure is selected, and then
click Finish.
6. In the Counts of correct/incorrect classification for structure Purchase Prediction report,
review the data for Purchase Bayes and Purchase Decision Tree.
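The classification matrix simply cross-tabulates predicted against actual Bike Buyer values for the test cases, so the diagonal holds the correct predictions. A minimal Python sketch of how such counts are produced (illustrative only; the data here is invented):

```python
from collections import Counter

def classification_matrix(actual, predicted):
    """Count (predicted, actual) pairs, as tabulated in the matrix."""
    return Counter(zip(predicted, actual))

actual    = [1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0]
matrix = classification_matrix(actual, predicted)
print(matrix[(1, 1)], matrix[(0, 0)])  # correct predictions for 1 and 0 → 2 2
```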

Task 3: Create a Profit Chart


1. On the ribbon, on the Data Mining tab, in the Accuracy and Validation area, click Profit Chart.

2. In the Profit Chart Wizard, on the Getting Started with the Profit Chart Wizard page, click Next.

3. On the Select Structure or Model page, select Purchase Prediction and click Next.

4. On the Specify Profit Chart Parameters page, set the following values and then click Next:

o Mining column to predict: Bike Buyer

o Value to predict: 1

o Target Population: 5,000

o Fixed cost: 500.00

o Individual cost: 1.00

o Revenue per individual: 150.00


5. On the Select Source Data page, ensure that Test data from mining structure is selected, and then
click Finish.

6. In the Profit Chart for Structure Purchase Prediction chart, review the data for Purchase Bayes
and Purchase Decision Tree.

Note: If the chart does not show all the data, select it and on the Design tab of the ribbon, click Select
Data. Then in the Select Data Source dialog box, click OK.

7. Save the workbook as DM Validation.xlsx in the D:\Labfiles\Lab13\Starter folder, and then close
Excel.
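Each point on the profit chart follows from the parameters you entered: revenue earned from predicted responders, minus the fixed campaign cost and the per-contact cost. A sketch of that arithmetic (illustrative only; the add-in's exact calculation along the population axis may differ):

```python
def profit(contacted, responders, fixed_cost=500.00,
           individual_cost=1.00, revenue_per_individual=150.00):
    """Campaign profit: revenue from responders minus fixed and
    per-contact costs. Defaults match the lab's chart parameters."""
    return (responders * revenue_per_individual
            - fixed_cost
            - contacted * individual_cost)

# For example, contacting 1,000 of the 5,000 prospects, of whom 100 buy:
print(profit(1000, 100))  # → 13500.0
```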

Results: After this exercise, you should have validated the data mining models by using the Data Mining
Add-in for Excel.

Exercise 5: Using a Data Mining Model in a Report


Task 1: Create a report
1. Start Visual Studio, and on the File menu, point to New and click Project.

2. In the New project dialog box, select Report Server Project Wizard. In the Name field, type
Promotion Targeting, in the Location field, browse to the D:\Labfiles\Lab13\Starter folder and click
Select Folder, and then click OK.

3. In the Report Wizard, on the Welcome to the Report Wizard page, click Next.

4. On the Select the Data Source page, select New data source, change the Name to
AWDataMining, in the Type drop-down list, select Microsoft SQL Server Analysis Services, and
click Edit.

5. In the Connection Properties dialog box, in the Server name field, type MIA-SQL, in the Select or
enter a database name drop-down list, click AW Data Mining, and then click OK.

6. On the Select the Data Source page, click Next.

7. On the Design the Query page, click Query Builder.

8. In Query Designer, in the Mining Model pane, click Select Model. Then in the Select Mining
Model dialog box, expand Purchase Prediction, click Purchase - Bayes, and click OK.

9. In the Select Input Table(s) pane, click Select Case Table. Then in the Select Table dialog box, click
ProspectiveBuyer (dbo), and click OK.

10. In the table at the bottom of the query designer, in the Source column, select ProspectiveBuyer
table, and then in the Field column, in the drop-down list, click FirstName.

11. Repeat step 10 to add five more rows to the table by using the settings in the following table, and
then click OK:

Source                       Field               Alias                  Criteria/Argument

ProspectiveBuyer table       LastName

ProspectiveBuyer table       Address Line 1

ProspectiveBuyer table       City

Purchase Bayes mining model  Bike Buyer                                 =1

Prediction Function          PredictProbability  Purchase Probability   [Purchase - Bayes].[Bike Buyer]

12. On the Design the Query page, click Next.

13. On the Select the Report Type page, select Tabular, and click Next.
14. On the Design the Table page, move all the columns into the Details area, and click Next.

15. On the Choose the Table Style page, select any style and click Next.

16. On the Choose the Deployment Location page, click Next.


17. On the Completing the Wizard page, change the Report name to Potential Bike Buyers and click
Finish.

18. In the report designer, on the Design tab, click the table.

19. In the Row Groups pane, click the (table1_Details_Group) drop-down list and click Group
Properties.

20. On the Sorting tab, click Add, in the Sort by drop-down list, click [Purchase_Probability], and in the
Order drop-down list, click Z to A. Then click OK.

21. In the Purchase Probability column, right-click [Purchase Probability] and click Text Box
Properties. Then in the Text Box Properties dialog box, click Number, select Percentage, and click
OK.

22. Click the Preview tab to review the Potential Bike Buyers report. If a console window is displayed,
minimize it.
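The Query Builder steps above assemble a DMX prediction query against the Purchase - Bayes model. Roughly, it resembles the following sketch; the column mappings in the ON clause are abbreviated, and the query the designer actually generates may differ in detail:

```sql
SELECT
  t.[FirstName], t.[LastName], t.[AddressLine1], t.[City],
  PredictProbability([Purchase - Bayes].[Bike Buyer]) AS [Purchase Probability]
FROM
  [Purchase - Bayes]
PREDICTION JOIN
  OPENQUERY([Adventure Works DW],
    'SELECT * FROM [dbo].[ProspectiveBuyer]') AS t
ON
  [Purchase - Bayes].[First Name] = t.[FirstName]
  -- ...remaining column mappings omitted...
WHERE
  [Purchase - Bayes].[Bike Buyer] = 1
```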

Results: After this exercise, you should have created a report that predicts bike purchasers.
