
Analyzing PI System Data

Version 2015

How to Use this Workbook


Each Main Heading describes a high-level valuable learning topic.

Your objectives are skills you can expect to learn in this segment.

New concepts are presented as level 2 headings.

Throughout the class you will be presented with questions and challenges to help you learn.

The majority of your time will be spent learning new skills via hands-on exercises, either in small groups or on your own.

Icons help you identify themes, like exercises, tools, tips, or documentation references.

User manuals, learning workbooks, and other materials used in class can be downloaded
from http://techsupport.osisoft.com. Login to an OSIsoft technical support account is
required.


Software Versions Used in this Document


The list below describes the software versions used in this version of the course.

Software                 Version

PI DataLink              2015
Microsoft Excel          2013
PI ProcessBook           2014
PI OLEDB Enterprise      2012
Microsoft SQL            2008 R2, 2012
PI Data Archive          2015
PI Asset Framework       2015
PI Coresight             2014

Contents

1.  Welcome ........................................................................ 5
    1.1  Course Environment ....................................................... 5
    1.2  Review PI System Architecture ............................................ 6
         1.2.1  The PI System Described
         1.2.2  Architecture of a Typical PI System
    1.3  Assets and Tags - The Basic Building Blocks in the PI System ............. 7
         1.3.1  Directed Activity - What is an Asset?
    1.4  Review of the PI AF structure used throughout this course ................ 8
         1.4.1  Element hierarchy ................................................ 10
         1.4.2  Template hierarchy ............................................... 11
         1.4.3  Element metadata (Static Table Data) ............................. 12
    1.5  Aggregating the data .................................................... 13
         1.5.1  Directed Activity - Add a Child Aggregate Attribute .............. 14
2   Business Intelligence ........................................................ 16
3   PI AF Tables ................................................................. 18
    3.1  Table Options ........................................................... 18
    3.2  Table Creation .......................................................... 19
         3.2.1  Directed Activity - Create Generation Rates Table ................ 23
         3.2.2  Exercise - Import Emission Rates ................................. 27
    3.3  Table Lookup ............................................................ 28
         3.3.1  Creating a Data Reference ........................................ 28
         3.3.2  Directed Activity - Create a Table Look-up Data Reference ........ 29
         3.3.3  Exercise - Create a Table Reference .............................. 32
4   PI Analysis Service .......................................................... 33
    4.1  Capabilities of the PI Analysis Service ................................. 34
    4.2  Expressions ............................................................. 34
         4.2.1  Directed Activity - Calculate Utilization for Assets ............. 37
         4.2.2  Directed Activity - Bulk Backfill ................................ 41
         4.2.3  Exercise - Calculate Generating Efficiency ....................... 43
    4.3  Rollups ................................................................. 44
         4.3.1  Directed Activity - Calculate Average Utilization for Substations  46
         4.3.2  Exercise - Calculate Total Gross Generation for Each Station ..... 50
5   Event Frame Generation ....................................................... 51
    5.1  What are Event Frames? .................................................. 52
         5.1.1  Creating Event Frames ............................................ 52
         5.1.2  Time Range Retrieval Methods ..................................... 52
         5.1.3  Directed Activity - Create a Temperature Anomaly Event Frame Template  54
         5.1.4  Create Inactivity Event Frame Template ........................... 60
    5.2  Event Frame Generation .................................................. 61
         5.2.1  Directed Activity - Gas Temperature Anomalies .................... 62
         5.2.2  Exercise - Detect Inactive Units ................................. 65
6   Analyzing Events ............................................................. 66
    6.1  Objectives .............................................................. 66
    6.2  PI Event Frames in PI System Explorer ................................... 66
         6.2.1  Directed Activity - Search for Inactive Events for GAO01 ......... 69
         6.2.2  Exercise - Search for recent temperature anomalies ............... 72
    6.3  PI Event Frames in PI DataLink .......................................... 73
         6.3.1  Directed Activity - How many temperature deviations occurred? .... 76
         6.3.2  Exercise - Analyzing Inactivity .................................. 78
    6.4  PI Event Frames in PI Coresight ......................................... 79
         6.4.1  Directed Activity - Gas Temperature Anomaly Events in PI Coresight  82
         6.4.2  Exercise - Root Cause Analysis in PI Coresight ................... 84
7   SQL Query Syntax Overview .................................................... 85
    7.1  Dissecting the Syntax ................................................... 85
    7.2  PI OLEDB Provider or Enterprise? What's the difference? ................. 87
         7.2.1  PI SQL Commander ................................................. 88
         7.2.2  Directed Activity - Review Predefined Queries .................... 89
    7.3  Aliases ................................................................. 92
    7.4  Joining Tables .......................................................... 93
         7.4.1  Query Short-cuts ................................................. 94
         7.4.2  Directed Activity - Manual joins ................................. 96
         7.4.3  Directed Activity - Element descriptions ......................... 97
         7.4.4  Exercise - Query for specific elements .......................... 100
    7.5  Built-in Functions ..................................................... 101
    7.6  Data Tables ............................................................ 102
         7.6.1  Directed Activity - Snapshot Values ............................. 103
         7.6.2  Exercise - Interpolated data .................................... 105
    7.7  Data Transpose Functions & Function Tables ............................. 106
         7.7.1  Transpose Function Wizard ....................................... 107
         7.7.2  Directed Activity - Unit Transpose Functions .................... 109
         7.7.3  Exercise - Create an Event Frame Transpose Function ............. 114
    7.8  Saved views ............................................................ 115
         7.8.1  Creating dataset views .......................................... 115
         7.8.2  Directed Activity - View Creation for Unit Performance .......... 116
         7.8.3  Exercise - Create Unit Specification Views ...................... 118
8   Importing PI Data for use in PowerPivot ..................................... 119
    8.1  Introduction ........................................................... 119
    8.2  Importing PI AF datasets ............................................... 119
         8.2.1  Directed Activity - Importing View Data Previously Created ...... 120
    8.3  Linked tables from Excel ............................................... 126
         8.3.1  Directed Activity - Importing Data with Linked Tables ........... 127
         8.3.2  Exercise - Prepare Tables for Importing ......................... 130
    8.4  Table Refresh .......................................................... 131
9   Creating the Cube and Adding Calculations ................................... 132
    9.1  Establishing table relationships ....................................... 132
         9.1.1  Exercise - Establishing table relationships ..................... 135
    9.3  Adding calculated columns using Data Analysis Expression Language (DAX)  136
    9.4  Where to Use Formulas .................................................. 137
         9.4.1  Directed Activity - Create a Total Hourly Emissions Calculation . 138
         9.4.2  Exercise - Create a Cost Calculation ............................ 139
10  Building the Fleet Generation Report ........................................ 140
    10.1  Creating PowerPivot tables ............................................ 140
         10.1.1  Exercise - One Version of the Truth ............................ 144
    10.2  Formatting tips ....................................................... 145
         10.2.1  Exercise - Consistent Table Layout ............................. 148
    10.3  PowerPivot charts ..................................................... 149
    10.4  DAX Time Intelligence ................................................. 151
         10.4.1  Exercise - Create DAX Calculations, Relationship and Sort ...... 152
    10.5  Limit Data Viewed by Customers ........................................ 153
    10.6  Slicers ............................................................... 154
         10.6.1  Exercise - Add Slicers; make connections to chart/table ........ 156
11  Exploring the Data with PowerView ........................................... 157
12  Final Exercise: Create PowerView and PowerPivot reports ..................... 162
13  Scripting in PI ProcessBook (Optional) ...................................... 163
    13.1  Scripting in PI ProcessBook ........................................... 163
         13.1.1  Directed Activity - Add ActiveX controls to a Display .......... 164
    13.2  Alarm Sample Overview ................................................. 165
    13.3  Setting Up and Acknowledging Alarms ................................... 165
    13.4  What Triggers Scripts? ................................................ 166
         13.4.1  Directed Activity - Review VBA of Display ...................... 167
    13.5  The Trend_TimeRangeChange Event ....................................... 168
         13.5.1  Sync Trends' Time Ranges ....................................... 169
    13.6  The Trend_DropCursor Event ............................................ 170
         13.6.1  Making use of PI ProcessBook Constants ......................... 171
         13.6.2  Exercise - Dynamic Values with Cursor Syncing .................. 172
    13.7  External ProcessBook Scripting ........................................ 173
         13.7.1  The Application Object Allows Relative References .............. 173
         13.7.2  How to Get Access to ProcessBook from Excel .................... 173
Appendix A  Substitution Parameters ............................................. 177
Appendix B  Additional PowerPivot Resources ..................................... 179
Appendix C  Performance Equation Operands and Functions ......................... 180
Appendix D  PI SQL Commander Table Relationships ................................ 184

1. Welcome
Welcome to the Advanced Client Tools Course!
Since you are attending this class, you should already have some experience with the OSIsoft client
tools (PI ProcessBook, PI DataLink, PI WebParts and PI Coresight), either using displays,
reports or webpages previously created to analyze your data, or creating those displays,
reports and webpages so that others in your organization have access to all the powerful data
that resides in the PI Data Archive and in data external to the PI System.
The basic tasks within these tools (such as building a PI ProcessBook display or a PI DataLink
report) are presumed to be understood; what you will experience here can be seen as a factory
of ideas, a space for OSIsoft customers to realize how powerful existing data can be when
analyzed with the advanced options of our tools and additional third-party tools.
We hope you enjoy it!

1.1 Course Environment


The environment for this course is hosted in Microsoft Azure. It consists of three virtual machines:

PIDC: the domain controller.

PISRV1: the PI Data Archive (PISRV1) and the PI AF Server (PISRV1) are installed
on this VM.
PICLIENT1: the primary working environment for the student.
o The VM has the client tools: PI ProcessBook, PI DataLink, and PI System Explorer.
o Microsoft Office 2013
o PI OLEDB Enterprise
o The user ID and password for each student are pischool\student1 and student,
respectively.


1.2 Review PI System Architecture


Objectives

Define the components of a PI System


Draw a diagram of the architecture of a PI System
1.2.1 The PI System Described
The PI System collects, stores, and manages data from your plant or process. You connect
your data sources to one or more PI Interface nodes. The interface nodes get the data from
your data sources and send it to the PI Data Archive. Users get data from the PI Data
Archive and display it with client tools.
These are generally the parts involved in a PI System:

[Diagram: data flows from the data source through a PI Interface (with buffering) to the PI Data
Archive and the PI AF server; a table lookup data reference can pull data from a relational
database; PI System users read the data with client tools.]

Data is collected from the source by the PI Interface program hosted by the acquisition node.
The data is sent to the PI Data Archive, and asset data can be contained in the PI AF server. It
is read from the PI Data Archive and PI AF servers by the client tools, such as PI ProcessBook.
1.2.2 Architecture of a Typical PI System

Sometimes the architecture can be very simple. Some customers have as few as one or two
interfaces feeding data to a single PI Data Archive. Access to data is through the single PI
Data Archive.

In many cases there are many PI Data Archives in an organization, aggregating data from
lower levels. Some corporations have PI Data Archives dedicated to servicing their clients
with restricted company data.

1.3 Assets and Tags - The Basic Building Blocks in the PI System
Objectives

Define a PI AF asset with its components: elements and attributes.

Define the four attribute types: Static (None), PI Point, Formula, and Table Lookup.
Define a PI Data Archive tag with the attributes Tag Name, Descriptor, and Point
Source.
Define the different data types that can be stored in PI Data Archive tags.


1.3.1 Directed Activity - What is an Asset?


The PI Asset Framework (AF) Server is a part of the PI System. It contains asset data, or
metadata, that is usually organized according to the assets containing the tags being
monitored. Assets can be helpful to users of the PI System who do not know or are not
familiar with PI tags. Using assets, they can find the data they need without understanding the
technical details of each piece of equipment. Assets are also helpful in finding all of the PI
tags associated with a specific piece of equipment.

1.4 Review of the PI AF structure used throughout this course


Open PI System Explorer (from the Start menu or task bar). The default database should be
Fleet Generation; if it isn't, you can click the database button and select it from the
resulting list. This database models a hypothetical power generation company.

Explore the elements in the hierarchical structure on the left-hand side. The first level of the
hierarchy lists the regions of our operation: CENTRAL, NORTH and SOUTHEAST. Within
each region is a list of stations, e.g. Albertsville, Beryl Ridge, and Carbondale.
Stations may have any number of generating units, so the number of AF elements configured
to represent these units varies from station to station.

Select any unit in the PI AF hierarchy and then select the Attributes tab in the PI System
Explorer window. A list of attributes associated with the selected unit will be displayed.
Each unit was configured from a base template named Unit. The template has several
attributes, as seen on the right. Attributes are defined by giving them a name and by
specifying a data reference type, which defines the data origin, e.g. PI Point, constant, table
lookup, or formula. The Unit template's attributes are explained below.


Template Definitions

Attribute                Data Reference Type                                  Engineering Units
Gross Generation         PI Point - Snapshot                                  Megawatts
Net Generation           PI Point - Snapshot                                  Megawatts
Operator                 Table Lookup from Active Units table (see AF Library)
Technology               Table Lookup from Active Units table
Hourly Capacity          Table Lookup from Active Units table                 Megawatt Hours
Generating Efficiency    Formula: A/B * 100, where A = Net Generation         Percent
                         and B = Gross Generation

1.4.1 Element hierarchy


An element's place in the hierarchy (i.e., its location in the world) is a very valuable class of metadata.
This data is often very useful for Business Intelligence applications as a means to relate assets to one
another. The complication, however, is that the underlying data cube in such applications is rigidly
cubic: it thrives on hierarchies being predictable and regular. It is possible to use a jagged hierarchy,
but this adds a thick layer of complication. As such, it is always advisable to use a predictable hierarchy.
At Fleet Generation, the hierarchy is relatively boring. Our company does not have much hierarchy,
though what we do have is incredibly valuable:

Generation companies often compare sites, and perform analyses site-by-site or region-by-region.
Not every company will be this clear-cut. Some sites will have a hierarchy that matters but which cannot
as easily be described by one universal diagram as above. Other sites won't have as much of a logical
hierarchy, but perhaps there is a hierarchy of organization (research departments), function (type of
bioreactor), or location (which building/room/benchtop/position a reactor is in). It all depends on the type
of analysis to be conducted. Perhaps the ambient temperature is thought to be causing growth differences in
different bioreactors, in which case the physical location could help sort the reactors into rooms,
proximity to a window, etc.


1.4.2 Template hierarchy


A good template hierarchy is the cornerstone of every AF model. This is where AF fulfills business goals
by allowing levels of similarity.
The Fleet Generation database contains a simple template hierarchy. The base equipment, the generation
unit, contains a set of attributes that define all generation units.
In addition to the Unit base template, there are two derived templates,
Gas Turbine and Steam Turbine, which extend the Unit base
template. These two derived templates inherit all the attributes of a
Unit template as well as additional attributes specific to either a Gas
Turbine or a Steam Turbine. The hierarchy can be viewed by right-clicking
Element Templates within the library and selecting
Arrange By > Arrange By Template Inheritance.

A gas turbine contains attributes that capture real-time
information on multiple exhaust gas temperatures, gas
fuel flow, gas fuel pressure, and gas turbine speed. A
steam turbine, on the other hand, contains attributes for
the steam pressure, steam temperature, and steam
turbine speed. Both the gas turbine and steam turbine
contain the inherited attributes from the Unit base
template, such as Gross Generation and Net Generation.

The gas turbine and steam turbine derived templates define additional attributes specific to each type of
turbine. Both gas and steam turbines inherit the same attributes from the base generation unit.

Building good template hierarchies is perhaps the single most important AF concept.


Capacity, Net Generation, Gross Generation, and Technology can all participate together in a report of
overall demand and average gross generation. The devices are physically very different, but functionally
all generate power, and this is a valuable similarity for the power generation business.
1.4.3 Element metadata (Static Table Data)
Attributes are flexible, holding numeric values or enumeration values, and often already exist as
part of the model.

Static attributes (where the value is persisted directly, not coming from a data reference) can be indexed
to support efficient searching and filtering. Creating an index is not free but, used sparingly, indices
bring great efficiency to queries like "Show me all of my GE-made turbines." Categories would be a hassle
here, since Manufacturer is not a true-or-false thing; it is a value from a set. However, the turbines
already have an attribute showing the manufacturer, so that attribute could be made static and indexed
(a flag set in the template).


1.5 Aggregating the data


Examine the attributes of one of the generation units. Most of the real-time data attributes are rate data,
not aggregate data.

Rate data can tell us:

At a certain time, what is the rate of power generation?

But aggregate data is the kind we want for business intelligence:

For a certain hour, what was the average power generation?
For a certain hour, what was the total energy generated?

So, we will add some additional data aggregates to the Generating Unit template.


1.5.1 Directed Activity - Add a Child Aggregate Attribute


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objectives:

Modify the Generation Unit template:

In PI System Explorer, browse to the Unit template.
Add a child attribute to the Gross Generation attribute.
Configure the attribute to summarize the daily average of Gross Generation.

Add a child attribute to Gross Generation, and name it Daily Average Gross Generation.

Configure the new attribute as follows, being sure to set UOM, type, and configuration fields of the
data reference.
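One plausible configuration is sketched below; it assumes the child attribute reads the parent
attribute's PI point and averages it over the last day, which matches the objective but is not
necessarily the exact dialog the instructor shows:

    Name:            Daily Average Gross Generation
    Default UOM:     megawatt (MW)
    Value Type:      Double
    Data Reference:  PI Point (the same point used by the parent Gross Generation attribute)
    Value retrieval: By Time Range = Average, over a relative time range of *-1d to *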


Go ahead and check in the changes. Now, we will have access to this attribute, which calculates an
average daily gross generation, whenever a user makes a query for the value in any element that utilizes
the UNIT element template.


2 Business Intelligence
Business intelligence (BI) tools offer solutions to quickly analyze raw, un-normalized,
multidimensional data. By combining historical values from the PI Data Archive, metadata
and calculations from PI Asset Framework, and business intelligence tools, users can
quickly create interactive reports to gain insight into business and operational processes.
Throughout the rest of the class, we will explore the process of preparing the Asset
Framework model to add additional dimensions of information to our AF database, extracting
desired information (process data, metadata, and event frame data) from the PI System
through PI Data Access tools, consuming the data inside a data cube, and constructing
interactive reports with Microsoft PowerPivot that allow us to slice and dice our data and
bring meaning to our multidimensional data cube.

The Fleet Generation database has a comprehensive amount of information, including a
hierarchy of generating units, metadata for each unit (operator, technology), and instantaneous
process data (net or gross generation). The figure to the right depicts a data cube that
captures metadata and real-time data of generating units.

Inclusion of additional attributes through table lookups, and analytics on existing attributes,
allow for the expansion of additional columns (or dimensions) in the data cube above.
Further, historical data, interpolated or compressed, adds an additional dimension of
information that brings more meaning to Business Intelligence reports.


In the next several chapters of the course, we will expand on the Fleet Generation database to
include meaningful data that will help management and engineers make better, more informed
decisions. Specifically, we will add value through the following:
1. Pull meaningful rate data into AF from external relational databases.
2. Develop analytics to calculate efficiencies of the Fleet Generation units.
3. Calculate summary statistics over the generation units for each substation.
4. Detect downtime events and thermal anomalies.
Once all this data is centralized in the Fleet Generation database, we will develop queries
using PI OLEDB Enterprise to extract pertinent data so that this information can be
consumed inside of Microsoft's Business Intelligence tools, PowerPivot and PowerView.
These BI tools will allow for dynamic, interactive reports.


3 PI AF Tables
The Fleet Generation database has a simple hierarchy. Additional information can be brought
into the database through the use of tables. Listed below are several terms associated with PI
AF table definitions. In each case, the tables can be accessed by PI AF elements through a
table lookup data reference.

3.1 Table Options

External (Linked) Table:


An external table is a PI AF table that is linked to a table in a non-PI relational database or
spreadsheet. The table data is not stored in the PI AF database. You cannot directly edit an
external table from PI AF. Linked tables have connection information stored for the query
and can be refreshed.

Imported Table:
An imported table is a PI AF table created by importing a table in a non-PI relational database
or spreadsheet. Once imported, the table data is stored in the PI AF database and has no
further connection to the non-PI source table. Imported tables are read/write tables.
It is a good practice to limit your imported tables to 10,000 rows of data or less. Imported
tables are not designed for storing very large databases. If you need to access a lot of data in
PI AF tables, use external tables.

Internal Table:
An internal table is any table maintained entirely in PI AF. Internal tables are either imported
tables or tables that are entirely defined and maintained in PI AF. In contrast, linked tables
are external tables because the table data resides outside PI AF.


3.2 Table Creation

In the PI System Explorer, navigate to the Library in the Fleet Generation database.

Right-click the Tables collection and select New Table.

Select the General tab for preliminary table definition.


o Update the table name and add a description for clarity.
o Additional fields can be completed as needed.

Table's time zone: the time zone in which the table's Date/Time values are displayed depends on
whether the Convert To Local check box is selected.


o To convert the DateTimes to local time, select the Convert To Local check box.
o To always display the DateTimes in the time zone selected in the Time Zone field,
  clear the Convert To Local check box.

In the Cache Interval field, enter the amount of time until the table's cached data is
automatically refreshed. In the drop-down menu, choose whether the value is in
seconds, minutes, hours, or days. The default value is zero, designated as
Manual Refresh.

Note: Automatic refreshing is disabled if the table has changes that have not been saved to the server.

Note: The Persistence Type and Query fields are read-only. PI AF populates these fields when you link
to or import an external table.


Define and populate the table in one of these three ways:


o Import a table from outside the PI AF server.
o Link to a table outside the PI AF server
o Manually define and populate the table in PI System Explorer.

Continue with Manual Table Creation.

Select the Define Table tab to define the columns of the table.

Right click in the white space on the display and select the Insert option.

Insert rows for each column required for the table.

Give each column a name; define the value type and unit of measure.
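For example, a hypothetical two-column rates table (the names simply anticipate the directed
activity that follows; define whatever columns your own data requires) could be laid out as:

    Column Name     Value Type     UOM
    Technology      String         <none>
    Rate            Double         US Dollar per kilowatt hour ($/kWh)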


Once the table is defined, select the Table tab and manually enter the data.
After completing the table definition, Check In the table to save the information.
o A small red check with an asterisk will appear to the left if Check In is
required.

To check in a change, select the Check In option to save the changes in the database.
Within the Check In display, new versions can be created with an effective date and
comment for the new version.
o AF elements can be versioned, meaning that if your plant goes from having
two old boilers to one high-efficiency boiler, your hierarchy can reflect the
old reality until a changeover date, at which point the new reality (one boiler)
is reflected in your hierarchy. The old reality (two boilers) is still visible if
you look back in history. This is why certain tables appear in two flavors, e.g.
AFElement and vAFElement, where the latter exposes versioning.
o In this scenario, we will not be working with versions.
When check in is complete, the small asterisk and check box will be removed and
the tree will be refreshed.


3.2.1 Directed Activity - Create Generation Rates Table


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
The engineering team currently has a SQL Server table of generation rates for different types
of generation sources. They would like to have this information stored inside of PI Asset
Framework as they will want to use these rates in later calculations. Since these rates may
change over time, they would prefer not to have static values stored inside of the PI Asset
Framework database and instead have a link to this relational database.

Approach:

In the PI System Explorer, navigate to the Library in the Fleet Generation database.
Right-click the Tables collection and select New Table.
Select the General tab for preliminary table definition. Define and populate the
table through the creation of a link to a table outside the PI AF server
Once the data is imported, verify the data and check the Units of Measure (UOM)
Check In the table to save the information.


Step 1: From the Library plug-in of PI System Explorer, create a new table called
Generation Rates.

Step 2: Click on the Link button to configure a linked table connection.

Step 3: From the connection dropdown, select <Build>

Step 4: From the Provider tab, select Microsoft OLE DB Provider for SQL Server


Step 5: Switch to the Connection tab. Set the server name to PISRV1. Select Use
Windows NT Integrated security for the log on information and select the
FleetGeneration SQL Server database. Test the connection.
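Behind the scenes, the Build dialog assembles a standard OLE DB connection string. Under the
settings above (server PISRV1, Windows integrated security, the FleetGeneration database) it would
look roughly like the following; the exact string is generated for you, so treat this as a sketch:

    Provider=SQLOLEDB.1;Integrated Security=SSPI;Initial Catalog=FleetGeneration;Data Source=PISRV1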

Step 6: In the Query textbox, type in:


SELECT * FROM dbo.GenerationRates;


Step 7: Set the cache interval to 1 hour.

Step 8: Verify that the table data is pulled into PI System Explorer.

Step 9: Set the Unit of Measure of the Rate column to the Energy Costs => $ / kWh


3.2.2 Exercise - Import Emission Rates


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
For environmental reporting purposes, management needs to report on the total carbon dioxide
generation for each of the generating units within Fleet Generation. This information is
currently stored in SQL Server under the Fleet Generation database. As this data does not
change frequently, it can be imported directly into PI AF. Build a connection to bring
these values into the Fleet Generation AF database.
Which technologies have the highest emission rate? Which have the lowest?

Approach:

Create a new table, called Carbon Footprint, and connect to the existing SQL Server
database table called Emission Rates.
Import existing data.
Verify UOMs. Emission rate column should be g/kWh.
Check-in the table.

Approximate completion time: 15 minutes.


3.3 Table Lookup


Until now, our experience may have been limited to using PI Points and Formulas as the data
reference in an attribute definition. Now that tables have been defined, the table lookup data
reference can be explored.

3.3.1 Creating a Data Reference

When creating a data reference:

One requirement is identifying the field in the table that relates to the attribute being
created.
A definite advantage of using a PI AF hierarchy is the ability to reference
components of the hierarchy.
o The components that can be referenced include elements, attributes, the database,
the server, etc.
o Using these components as references is done through substitution parameters.

Substitution Parameters
PI AF does a direct substitution of the substitution parameter for whatever that particular
parameter represents. The format of a substitution parameter is %referenceditem%.
For example, %Element% is a substitution parameter that represents the element name.

Item Referenced               Syntax
Attribute                     %Attribute%
Value of Attribute            %@Attribute%
Root Element of Attribute     %\Element%

A complete list of substitution parameters is in Appendix A.
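As a small illustration (the table and column names here are hypothetical), a table lookup WHERE
clause could use the element name to pick the right row:

    UnitName = '%Element%'

For an element named GAO01, PI AF substitutes the element name directly, so the clause evaluated
against the table becomes UnitName = 'GAO01'.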


3.3.2 Directed Activity - Create a Table Look-up Data Reference


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: Add a data reference to the Unit Element Template associated with the
Generation Rates table to enable calculating the cost of generation.

Approach:

Open PI System Explorer.

In the Browser, select the Unit element template.

In the Viewer, create a new attribute template.

Name the field Rate.

Set the default UOM.

Select Table Lookup as the data reference.

Select Settings.


Select the Generation Rates Table.


The Result Column is Rate and will be populated with this table lookup.

Note:

Select the Stepped check box for the value to be stepped when plotted in a trend. With this setting,
there is no interpolation between the table values.


The Unit of Measure box will be populated from the referenced table.

In the Behavior section, select the appropriate radio button:


o First row matching criteria.
Use the Order by menus to specify the sorting order. This order is used to
select a row when more than one row matches the criteria.
o Summarize all rows matching criteria.
Choose a summary operation from the Summary menu to do the selected
operation on the selected column over the range of rows that match the
criteria. (Operations include SUM, Average as AVG, Minimum as MIN,
Maximum as MAX, COUNT, Standard Deviation as STDEV, and Variance
as VAR.)
o Table provided time series data.
Choose this option if the table has values with associated time stamps and
you wish to treat these values as time series data. From the Time Column
menu, select the table column that contains the time stamps you want to use.

Only columns with a value type of DateTime appear in the menu. The
WHERE clause is not required when you choose this option.

In the Where section, use the menus and buttons to build the table query.

Note:

You can manually type the entire clause into the Complete WHERE Clause text field. The
construction/syntax of the WHERE clause will be reviewed later in this course.

In the Column box select the column of the table to use in the query.

In the Operator box select the relational operator to use in the query. The
interpolation option will return an estimated (interpolated) value for the result column
based on the values contained in the referenced input columns.

In the Attribute or Value box select an attribute or a literal value to use in the query.

Click Add And or Add Or to write the WHERE clause into the Complete Where
Clause box with an AND / OR operator.

In the Complete Where Clause box, edit the clause as needed.

Note that the Add And or Add Or buttons in the dialog box automatically generate
the necessary syntax, UOM, and time zone conversions when possible.
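As an illustration tied to the Generation Rates table from the previous activity (a sketch of what the
generated clause typically looks like, assuming the table has a Technology column), the lookup that
matches each unit's Technology attribute against the table would read:

    Technology = @Technology

The left-hand side names the table column; @Technology refers to the value of the element's Technology
attribute, so every Unit element retrieves the Rate row for its own technology type.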


3.3.3 Exercise - Create a Table Reference


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective:
Regulations on carbon emissions are continually growing tighter. The amount of carbon
emissions must be reported by all power companies based on the type of technology.
Which units have the highest emission rate? Which have the lowest?
Approach:

Open up the Unit Element Template in the Fleet Generation Database.


Add a new attribute called Carbon Emissions
Set the Carbon Emissions attribute to a Table Lookup based on the technology
type.
Check-in the Unit Element.
Verify the individual unit elements were updated with the carbon emissions amounts.

Estimated completion time: 15 minutes.


4 PI Analysis Service
PI Asset Framework is a powerful tool to help model the infrastructure of a company, region,
or division. Through PI Asset Framework Formula data references, you can create simple,
on-the-fly calculations. PI Asset Framework also comes packaged with the PI Analysis
Service, for more advanced analyses. The analytic capabilities include three analysis types,
Expressions, Rollups, and Event Frame Generation, which allow calculations to be
applied at the template level as well as the ability to persist the results back to the PI Data
Archive.
There are alternative calculation methods to the PI Analysis Service. These calculation
methods can run on PI Asset Framework (PI Analysis Service, PI AF data references),
on the PI Data Archive (PI PE tags, PI Totalizer tags) or on dedicated hardware (PI ACE). It
should be noted that only the PI Analysis Service and PI ACE allow for complex
calculations, but only the PI Analysis Service can build the calculations through
configuration of an object, without requiring developer-level technical expertise. This is in
addition to the other benefits of PI Asset Framework, such as templatization and inclusion of
non-PI data. A comparison table is below.

                               PI ANALYSIS    PI AF DATA      PI ACE        PI PE TAGS     PI TOTALIZER
                               SERVICE        REFERENCES                                   TAGS
CREATION METHOD                Configuration  Configuration   Programmed    Configuration  Configuration
TECHNICAL SKILL LEVEL          Basic          Basic           Developer     Basic          Basic
PERSISTENCE / HISTORIZE        Yes (or No)    No              Yes           Yes            Yes
ASSET-BASED CALCULATION        Yes            Yes             Yes (MDB)     No             No
TEMPLATES SUPPORT              Yes            Yes             Yes*          No             No
INCLUSION OF NON-PI DATA       Yes            Yes             Yes           No             No
PI HA SUPPORT                  Yes            Yes             Yes           No             No
RECALCULATION                  No             No              Manual        Manual         No
BACKFILLING                    Yes            No              Manual        Manual         No
HANDLES COMPLEX CALCULATIONS   Yes*           No              Yes           No             No

* Complex calculations are limited to PI PE syntax and cannot be called recursively. For
more advanced functions including calling external libraries, PI ACE must be used.


4.1 Capabilities of the PI Analysis Service


The PI Analysis Service runs as a service that monitors all analyses and the attributes associated with
these analyses.

Expressions:
Expressions allow for multi-line calculations that utilize mathematical operators and functions,
if-conditions, and PI time-based functions to perform advanced analyses. Expressions, created for a given
asset type (element template), are automatically applied to all elements of that type. Results can be written
directly back to the PI Data Archive.
Rollups:
Rollups allow for the calculation of summary statistics (averages, maximums, minimums) of values from
a set of AF attributes. Current statistical values can be written directly to the PI Data Archive.
Event Frame Generation:
PI Analysis Service allows for the automatic detection of events that occur. These events are bookmarked
and information for any event type can be retrieved for further analysis.
Scheduling:
Expressions and Rollups can be scheduled to run whenever a new event arrives into the PI Data Archive
or calculated on a periodic basis.
Backfilling:
Results from all three types of analyses can be backfilled into the PI System.

4.2 Expressions
With Expressions, you can implement calculations through a set of built-in functions that take values of
attributes in PI Asset Framework as inputs and output results to other PI AF attributes. Expressions can
be scheduled to run periodically or to run whenever the input parameters of the expressions
receive a new value.


Multi-line calculation dependency allows each expression to be written to a different output attribute, as
well as the re-use of calculated results in subsequent expressions.

Each set of expressions allows for periodic or event-triggered scheduling.
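A minimal sketch of a multi-line expression (the variable names and the 0 MW threshold are
illustrative only; Gross Generation is an attribute of the Unit template used later in this chapter):

    HourlyAvg = TagAvg('Gross Generation', '*-1h', '*')
    Status    = If HourlyAvg > 0 Then "Generating" Else "Idle"

The second line re-uses the result of the first, and each line can be mapped to its own output
attribute or left as an intermediate variable.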


Function Category                    Example
Archive Value Statistics             TagAvg, PctGood
Date and Time                        Bod, Hour
Logical                              And, If
Math                                 Abs, Sqr
Operators                            >, <>, *
PI Data Archive Digital States       DigState, DigText
Point Attributes                     TagSpan, TagType
Search and Retrieval                 TimeEq, NextEvent
Statistical                          Rand, Total
Status                               NoOutput, TagBad
String                               Len, Text

A set of built-in performance equation-like syntax allows for access to a range of functions.
The available options include mathematical and logical operators and functions, date and time
functions, PI-specific performance equation functions, and string manipulation functions.


It is recommended to configure analyses at the template level.


The following procedure can be used to configure an Expression analysis using a template:
1) In the AF Database Library, create a new analysis template of type Expression.
2) Define expressions for the calculations in the analysis template.
3) Define the scheduling for the analysis template.
4) Define output attribute templates to store results.
5) Create the PI tags used to store the results.
6) Evaluate and preview the data to validate calculations.
7) Backfill the calculation if required.


4.2.1 Directed Activity - Calculate Utilization for Assets


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
The capacity utilization is a percentage that represents the amount of electrical power that a
unit produced against its theoretical capacity. Configure, test, run, and validate analyses to
calculate the percent utilization of all generating units.
Approach:

In the PI System Explorer, navigate to the Library in the Fleet Generation database.
Under Element Templates, select the UNIT element template.
Select the Analysis Templates tab to configure the multi-lined expression for
Utilization:
Utilization = Total Hourly Gross Generation / Hourly Capacity
Specify and configure an attribute template to store the results.
Schedule the calculation to run periodically every hour.
Backfill unit GAO01 for the past seven days.


Step-by-step approach
Step 1: From the Unit Template, found in the Library plug-in of the Fleet Generation database,
select the Analysis Templates tab.

Step 2: Configure a new analysis. Name the analysis Utilization and set the analysis type to
Expression.

Step 3: Configure the expressions for the hourly total of Gross Generation and Utilization.
HourlyTotal = TagTot('Gross Generation','*-1h','*') * 24
Utilization = HourlyTotal / 'Hourly Capacity' * 100

Note: The HourlyTotal must be multiplied by 24, as the Performance Equation function TagTot assumes
the units of the input attributes are per day. Conversion factors should not be used elsewhere with PI
Asset Framework, as UOM conversions occur automatically.
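As a worked example with invented numbers: if a unit generated a steady 50 MW over the past hour,
TagTot('Gross Generation','*-1h','*') returns 50 / 24 ≈ 2.08 (a total expressed per day); multiplying
by 24 gives the expected 50 MWh for the hour. With an Hourly Capacity of 80 MWh, Utilization
= 50 / 80 * 100 = 62.5 %.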
Step 4: Define two new output attribute templates, Total Hourly Gross Generation and
Utilization, to store the calculated results. Define the UOM to be MWh and %.


Step 5: Create the PI Tags


After the new attribute template has been configured, switch over to the Element Hierarchy.
The attribute values for the new tags should be Pt Created. If not, right-click on the root
Elements object. Select Create or Update Data Reference to automatically create the PI tags to
store the calculated results.

Step 6: Switch back to the Unit Template Analysis Templates tab to schedule the Analysis
Template to run periodically at the top of each hour.

Step 7: Drill down to GAO01 from the element hierarchy. Select the Analysis
tab and the Utilization analysis. Click on the Evaluate button to validate the
expressions.


Step 8: Prior to backfilling data into the PI Data Archive, right-click on the analysis to preview
the results over the past 7 days.

Step 9: Right-click on the analysis and select Backfill. Specify the start and end time of *-7d
and *, respectively to begin the backfill process.

Step 10: Once data has been backfilled, observe the current value of
the Utilization attribute. Right-click on the attribute and select Time
Series Data to validate the backfilling.


4.2.2 Directed Activity - Bulk Backfill


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: By now, the Utilization analyses have been set up for all the
units of Fleet Generation. In order to perform deeper analysis over the data from the past
week, backfilling needs to be performed.

Approach:
Step 1: From PI System Explorer, select the Analyses plugin.

Step 2: From the filter dropdown selection, select Analysis Template. Select all analyses tied to the
Utilization analysis template.

Step 3: Select the Backfill operation and set the start time to *-7d


Step 4: Click the Queue button to queue the backfilling.


4.2.3 Exercise - Calculate Generating Efficiency


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective:
Power generators require electricity to operate. The net generation is defined as the
gross generation, or the amount of electricity that a generator produces, less the electricity
required to operate the unit. Calculate the generating efficiency, or the ratio of the net
generation to the gross generation, expressed as a percentage.
Which unit is performing with the greatest efficiency?

Approach:

In the PI System Explorer, navigate to the Library in the Fleet Generation database.
Under Element Templates, select the UNIT element template.
Select the Analysis Templates tab to configure the expression for generating
efficiency.
Specify and configure an attribute to store the results.
Schedule the calculation to run periodically every hour.
Backfill all generating efficiency analyses for the past seven days.


4.3 Rollups
The second analysis capability of the PI Analysis Service is known as rollups.
Rollups allow for the calculation of summary statistics for a set of attribute values.
The types of summary statistics that are allowed are:

Sum
Average
Minimum
Maximum
Count
Median

Examples of rollup calculations include:

Total mass of all contents in a tank farm

Total production from all generating units for a particular site

Maximum temperature of boilers within a building

Average engine temperature of mining trucks

Average temperatures for each asset with varying temperature sensors.

Selecting attributes to rollup


Attributes used in rollup calculations can come from 1) the child elements relative
to the element of interest, or 2) the element of interest itself. You can set search criteria to specify
the attributes to roll up. Depending on the source of the attributes (child elements or
current element), the search criteria include masking patterns for the 1) Attribute Name, 2)
Attribute Category, 3) Element Category, and 4) Element Template.
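Purely as an illustration (these values anticipate the directed activity that follows rather than
prescribe it), a rollup configured on the Station template might use criteria such as:

    Roll up attributes of:   child elements
    Attribute Name:          Utilization
    Element Template:        Unit
    Rollup function:         Average
    Output attribute:        Average Utilization

Any child element of a station that is based on the Unit template and exposes a Utilization attribute
is then included in the average.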


What is an Example Element?


During the configuration of a rollup analysis template, when the source of the attributes to
roll up is the child elements, PI System Explorer does not know which parent element
to retrieve child elements from. As such, when configuring a rollup analysis template, you
will need to specify an example element. Note that when configuring a rollup at the element
level, you will not need to select an example element, as the child elements come from the
specific, selected element.

Scheduling and backfilling


Similar to Expressions, rollup analyses can be scheduled to run as new events come into
the PI Data Archive or scheduled to run periodically. The PI Analysis Service also allows the
results from rollup calculations to be written back to the PI Data Archive.

The general process to properly configure and backfill an analysis template is:
1) Create a new analysis of type Rollup.
2) Define the source of the attributes to roll up (child elements or current element).
3) Select the type(s) of summary statistics to calculate.
4) Define output attributes to store results.
5) Define the scheduling for the analysis.
6) Create the PI tags used to store the results.
7) Evaluate and preview the data to validate calculations.
8) Backfill the calculation.


4.3.1 Directed Activity - Calculate Average Utilization for Substations


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: Management would like to have visibility over the average percent utilization of
all generating units for each substation. Roll up the average utilization to the substation level.
Approach:


Open up the Station Element Template from the Fleet Generation Database Library.
Add a new analysis called Average Utilization with analysis type of Rollup.
Select Central\Albertsville as the example element.
Specify the criteria to select the attributes used for the rollup calculation.
Select the summary statistic function for the average.
Specify the output attributes (be sure to create the tags).
Schedule the calculation to be event-triggered.
Verify data.
Backfill for the past 7 days.

Step-by-Step Approach
Step 1: From PI System Explorer, select the Fleet Generation Library plugin. Then select the
Station Element Template. From the Analysis Templates tab, create a new Analysis called
Average Utilization.

Step 2: Specify the rollup attributes from child elements. Set the example element to be
Central\Albertsville.

Step 3: Set the attribute name field to utilization. This mask will automatically select all
Utilization attributes from the child elements of the Albertsville station.

Step 4: Set the scheduling to be event-triggered. Each time the Utilization analysis finishes
calculating each hour, the rollup analysis will run.

Step 5: Select Average as the rollup function and create a new Output Attribute called
Average Utilization. Set the default UOM of this new attribute.


Step 6: Click on the Evaluate button to verify the result of the rollup function.

Step 7: Check-in your changes.


Step 8: From the element hierarchy, verify that the PI tag exists for the attribute.


Step 9: From the Analysis pane, backfill your Average Utilization rollup analyses for the past seven days.


4.3.2 Exercise - Calculate Total Gross Generation for Each Station


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Management would like to gain more insight into the total gross generation at each station.
Create a rollup analysis to totalize the hourly gross generation at the station level.
Which station produces the most power?
Approach:


Open up the Station Element Template from the Fleet Generation Database Library.
Add a new analysis called Total Hourly Gross Generation with analysis type of
Rollup.
Select Central\Albertsville as the example element.
Specify the criteria to select the attributes used for the rollup calculation.
Select the summary statistic function for the summation.
Specify the output attributes (be sure to create the tags).
Schedule the calculation to be event-triggered.
Verify data.
Backfill for the past 7 days.

5 Event Frame Generation


Events are important process or business time periods that represent something happening
that affects your operations. In the PI System, events are known as event frames. Thanks to PI
Event Frames, you can analyze your PI data in the context of these events rather than by
continuous time periods. Instead of searching by time, PI Event Frames enables users to
easily search the PI System for the events they are trying to analyze or report on.
With PI Event Frames, the PI System helps you capture, store, find, compare and analyze the
important events and their related data.
PI Event frames represent occurrences in your process that you want to know about, for
example:

Downtime tracking
Environmental monitoring excursions
Process excursions
Product tracking batches
Equipment startups and shutdowns
Operator shifts

The following table presents some of the features and advantages of PI Event Frames:
Flexibility        - Reference multiple elements within the same event.
                   - Support multiple overlapping events on a PI AF element.
                   - Capture any event; a "batch" is just one type of capturable event.

Powerful search    - Search by time range, type of event, or event frame attribute.
                   - The most common search attributes can be configured as indexed
                     attributes to speed up end-user searches.

Scalability        - PI Event Frames are extremely scalable.

A PI Event Frame is defined by three characteristics:


1. Name.
2. Start time and end time: defines the event's time range.
3. Context: event attributes and related assets.
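A purely hypothetical example of these three characteristics (the name, times, and values are invented;
the template and attributes are built later in this chapter):

    Name:         GAO01 Gas Turbine Temperature Anomaly 01-Jun-2015 14:32
    Start / End:  01-Jun-2015 14:32  to  01-Jun-2015 15:05
    Context:      primary referenced element GAO01, plus event frame attributes
                  such as Status, Duration, and Technology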


5.1 What are Event Frames?


5.1.1 Creating Event Frames
The Fleet Generation database contains a series of Elements representing the regions and
units associated with each generation plant. In order to keep up with the power demands, it is
important that the plant is up and running. We need to keep track of the uptime associated
with the generation plant.
A Unit Status attribute is associated with each generating plant in our hierarchy. This
attribute will be used to monitor the uptime associated with each plant.
5.1.2 Time Range Retrieval Methods
There are three time range retrieval methods, the use of which depends on what data is to be captured, and
how it is to be displayed.
Time Range
This method allows a time range to be supplied by the end user. When any single value query is made,
this period of time is used for calculations. If however a period of time is supplied from an application,
such as a generated Event Frame or Coresight display, then the user specified time range is discarded and
the application time period is used.
Time Range Override
The Time Range Override method behaves in the same way as the Time Range method during all single-value
queries, as it uses the user-specified time period. When a period of time is supplied from an application,
the application time range is discarded and the user-specified period is used.
Not Supported
Not Supported does not allow for a time range to be supplied by the end user. As such, an error is
returned by any request for a single value. If a period of time is supplied however, then this range is
adopted by the method for the calculation. The result is then the same from the Time Range method.

There are different use cases for the methods, so care must be taken to ensure the correct method is used.
METHOD                 SINGLE VALUE                         APPLICATION-SUPPLIED TIME RANGE
TIME RANGE             User-specified range result          Application-specified range result
TIME RANGE OVERRIDE    User-specified range result          User-specified range result
NOT SUPPORTED          Error: This attribute requires a     Application-specified range result
                       Time Range to calculate a value

Single timestamp query results (sample element with 1h specifications)

Application supplied time range query results (sample 3h event frame)
5.1.3 Directed Activity Create a Temperature Anomaly Event Frame Template


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
The gas turbines in the Fleet Generation database each have three temperature sensors. Create
an Event Frame template with appropriate attributes to help monitor and analyze potential
issues with gas turbines. The event frame should capture the real-time data specific to gas
turbines and the current status and duration of the gas turbine.
Approach:
Create an Event Frame template.
Create a template called Gas Turbine Temperature Anomaly; none of the additional fields on the General tab are required.
Select the Attribute Templates tab. Right-click in the white space to create an attribute.
Name the attribute Status.
Select Enumeration Sets => Status as the value type.
Select the PI Point Data Reference, then select Settings.
The attribute name is in the format .\Elements[.]|Unit Status. The Event Frame references a PI AF Element. The [.] syntax points to this PI EF Template's primary referenced PI AF element within the Elements collection.

Note: Each parent object has a default collection type. For example, a PI System has a default collection of databases, and an AF database has a default collection of elements.

Set the By Time Range dropdown option to Start Time.

Note: Substitution parameters cannot be used to make a reference to an attribute from the Element Template that is not a PI Tag.

Upon completing the definition, click OK. The settings will be completed as seen below:
Create a second attribute to store the Duration of the event frame.

Set the settings for the attribute as:

Create a third attribute to store the Technology. For the Value Type, select String and
for the Data Reference, select String Builder.

Note: When the event frame attribute's data reference is set to PI Point, the syntax .\Elements[.]|Attribute only allows references to PI Point Data Reference attributes. Element attributes configured as formulas and table lookups cannot be passed to event frames using a PI Point Data Reference. Instead, for attributes configured as formulas or table lookups, select String Builder as the data reference.

Set the settings for the attribute as .\Elements[.]|Technology:

Continue to create the following additional attributes. Make sure units are properly set. Hint: You can start by copying and pasting these attribute templates from the Gas Turbine element template.
1. Exhaust Gas Temperature - #1 Probe
2. Exhaust Gas Temperature - #2 Probe
3. Gas Fuel Flow
4. Gas Fuel Pressure
5. Gas Turbine Speed
For each of the attributes, set the reference attribute to .\Elements[.]|%attribute% and set the By Time Range option to Start Time.

Note: %attribute% will substitute in the name of the event frame attribute template. This will then point
to the corresponding attribute in the referenced element. You can also select multiple attributes when
making modifications to the attribute configuration.

5.1.4 Exercise: Create Inactivity Event Frame Template


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective:
Generating units sometimes trip or go down. Management would like to understand these
downtimes, and determine how much demand was not serviced. Event frames can help
capture and bookmark these events for future analysis. Develop an Event Frame template,
called Inactivity, with fields required to track the desired plant information to create reports
for management. Specifically, management would like to know the following:
1. Demand - Real-time
2. Operator - Metadata
3. Technology - Metadata
4. Carbon Emissions - Metadata
5. Total Demand - Real-time, Aggregation of Demand
6. Duration - in seconds
7. Hours Down - in hours
8. Status - Real-time
Hints:
1) For metadata, use String Builder as the Data Reference.
2) For Total Demand, configure the attribute's By Time as Not Supported and By Time Range as Total.
3) Verify correct event frame template configuration through the creation of a test event
frame.
5.2 Event Frame Generation


The Event Frames Generation analysis allows for the automated detection and generation of
event frames in the PI AF database based on values from trigger attributes. The type of events
and the types of data captured inside each event are defined with event frame templates in PI
AF.

Some notable features of Event Frame Generation in the PI Analysis Service include the following:
Generate events: Easily configure event generation and automatically generate your events from the
trigger tags that are already collecting data in the PI Data Archive.
Handle multiple event types: Generate all your different event types, such as downtime, excursions,
batches, and other events, on the same asset with no restrictions on overlapping events.
Standardize using event frame templates and populate event attributes: Different event types have
different attributes and information that are important for analysis. Standardize your events using event
frame templates, and use the PI Analysis Service to automatically populate event attributes with data
from the PI Data Archive and PI Asset Framework.
Backfill events: PI Analysis Service enables you to define your history backfill time window; it then backfills the events from previous time periods automatically.
Using PI AF element attributes as event triggers or event attribute values: Trigger conditions for
event frames can be linked to element attributes.
Configure using PI AF element templates: Apply the configuration of event frame detection and
generation to PI AF element templates. The same event detection automatically applies to newly created
assets of the same asset type. There is no need to configure the event frame generation again.
Root Cause: Event frames are great for capturing events that have occurred. Often, however, the time period prior to the event provides more information on the cause of the event.
PI Analysis Service allows for root cause analysis and will capture a fixed time period
(default five minutes) before the event start time for further analysis. This will be recorded as
a Child Event Frame.
Time True: The trigger condition for event frames could potentially be noisy. PI Analysis
Service allows for the specification of a minimum time true period before an event frame will
generate.
Event Frame Generation in PI Analysis Service does not have the following capability at this time:
Child Event Frames: generation of child event frames. Event frames can contain child event frames to capture sub-events that occur during the time period of the main event frame, but apart from the root cause child event frame described above, PI Analysis Service does not generate them.
5.2.1 Directed Activity Gas Temperature Anomalies


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Each gas turbine has multiple temperature sensors. If any temperature reading deviates more
than 20% from the average, then servicing is required. Use the Gas Turbine Temperature
Anomaly Event Frame Template to help define these types of events.

Approach:
Step 1: From the Fleet Generation Library, select the Gas Turbine Element Template and
select the Analysis Templates tab. Create a new analysis template called Gas Temperature
Anomaly.

Step 2: Set the example element to GAO01.

Set the Event Frame Template to Gas Turbine Temperature Anomaly.
Step 3: Define new expressions called AvgTemp and DeltaTemp. Set the expressions to:
Avg('Exhaust Gas Temperature - #1 Probe','Exhaust Gas Temperature - #2 Probe')
'Exhaust Gas Temperature - #1 Probe' - 'Exhaust Gas Temperature - #2 Probe'

Step 4: Define the StartTrigger as:


IF Abs(DeltaTemp/2)/AvgTemp > 0.2 THEN TRUE ELSE FALSE

Step 5: Enable the generation of a child root cause event frame. Set the duration to 30 minutes.

Step 6: Set the scheduling to Event-Triggered and triggering to Any Input.
Step 7: From the Analyses plug-in, backfill event frames for the past seven days for all Gas Turbine
Temperature Anomaly analysis templates.
5.2.2 Exercise: Detect Inactive Units


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective:
Engineering would like to perform a deeper analysis into events over the past week in which
the generating units are inactive. They would like to examine the 5 minutes prior to the event
to understand the cause. Configure the event frame generation to automatically capture new
events and detect historical events.
How many inactive events have been occurring?

Approach:

Open up the Unit Element Template from the Fleet Generation Database Library.
Add a new analysis called Inactive Units with analysis type of Event Frame
Generation.
Specify the event frame template: Inactivity.
Define the trigger condition to automatically detect inactive events.
Verify data.
Backfill for the past seven days.
6 Analyzing Events
6.1 Objectives
PI Event Frames are stored in a relational database on the SQL Server hosting the AF
databases. These event frames can be viewed, filtered, and analyzed using PI tools such as PI System Explorer, PI Coresight, and PI DataLink.

6.2 PI Event Frames in PI System Explorer


The easiest way to view PI Event Frames is through PI
System Explorer. From the Event Frames Pane, you can
perform searches against all the event frames within an
AF database. You can filter based on specific referenced
elements, specific time ranges, and much more.

From the properties of an Event Frame Search, you can specify the following search
parameters for the time of the event frame, and the properties of the event frame:
Search type: Specify how to perform an event frame search. Find all event frames that are
entirely between a start and end time? Starting or ending between a start and end time?
Search start: Specify the start time for event frame search.
Search end: Specify the end time for event frame search.
Include descendants: Search for all child event frames in addition to parent event frames.

Event Frame Name: Filter based on the name of an event frame. Can use wildcards.
Element Name: Filter based on the name of the referenced element. Can use wildcards.
Template: Filter based on the event frame type.
Additional Criteria: Ability to filter based on duration, attribute value, event frame search
root, and specify how many results to return.

The resulting search query is combined into a string within the search field. This allows for
direct manipulation of the data fields without using the menu options.

The default search results bring back fields detailing the duration, start time, end time, description, category, and template, along with a Gantt chart. Any of these fields can be hidden by using the settings cog in the top right corner of the search results. Additionally, values from the event frame attributes can be pulled back into the search results through this same option list.
Because PI Event Frames are essentially bookmarks that reference data from elements, tables, and themselves through formulas, a search for event frames and their attribute values can take a while. Fortunately, built into PI Asset Framework is the ability to capture, or persist, the event frame attribute values. These values are stored in a relational database on the SQL Server.
You will notice that there is a blue pin icon to the left of many event frames. This indicates that the values are persisted, or captured. Any event that PI Analysis Service automatically detects will have its values captured.

If you do update the event frame template, however, you may need to recapture the values, as the existing event frame attribute values will not automatically reflect the changes. To recalculate, right-click on a set of event frames and select Recapture.
6.2.1 Directed Activity Search for Inactive Events for GAO01


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Find all Inactive events for the units GAO01 and GAO02 over the past 24 hours. Examine the
technologies that are involved in these inactive events.

Approach:
Step 1: Click on the event frame plug-in. Right-click on Event Frame Search 1 and select Properties.
Step 2: From the Event Frame Search screen, specify the search start to *-1d, end to *, and uncheck
the All Descendants checkbox. For the Element Name textbox, specify GAO0? and set the Template to
Inactivity.

Step 3: The search will return several inactive event frames. Select all of them and click on OK.
Step 4: Click on the gear icon to the right of the fields, and remove the description and category fields.
Then click on Select Attributes.
Step 5: Select the Technology attribute from the Select Attributes wizard.

Step 6: Examine the Technology that is leading to the downtime for these Inactive Units.
6.2.2 Exercise Search for recent temperature anomalies


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Find all temperature anomaly events for the gas turbines over the past 48 hours that last for
more than one hour. Pull back the temperatures for each of the two gas temperature sensors.
Which unit has the highest starting Gas Fuel Pressure during a temperature anomaly, and when
was it?

Approach:
Perform an event frame search and format results for the desired attributes.
6.3 PI Event Frames in PI DataLink


PI DataLink allows you to retrieve current, historical, and calculated data back into Microsoft
Excel. In addition to these capabilities, PI DataLink also allows for the retrieval of event
frames back into Excel for further analysis.

There are two retrieval methods for Event Frames inside of PI DataLink:
Explore: Find Event Frames that meet the specified criteria and display them in a
hierarchical format, which is useful to analyze events sharing the same EF template.

Compare: Find Event Frames that meet the specified criteria and compare their attributes in
a flat format. This allows a flat list of events with attributes relating to child events all within
a single row.

For either the Compare or Explore Events, you can specify parameters to search for specific
event frames. You can specify the following:
Database: AF Database to search against.
Event Name: Search pattern to search for specifically named event frames.
Search Start: Search for all event frames that occurred after this time.
Search End: Search for all event frames that occurred before this time.
Event Template: Search for specific types of events.
Element Template: Search based on the type of referenced element.
Element Name: Search pattern for the name of the referenced element.
More search options: Search based on attribute values, duration, and category.
Number of child event levels: Only for Explore Events and allows for the hierarchical
display of events.

Searching for event frames can be based on multiple attributes.
When searching with Explore Events, the results can be displayed hierarchically based on the
relationships between child and parent event frames.
6.3.1 Directed Activity How many temperature deviations occurred?


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Temperature deviations could potentially mean damaged machinery. Engineering is interested
in analyzing the Natural Gas units. Find out how many instances of temperature deviations
occurred for gas turbines that lasted for more than 30 minutes.
Approach:
Step 1: From PI DataLink inside of Excel, specify the Database as \\PISRV1\Fleet Generation,
Event name as gas turbine*, Search start as *-1d, and Event template as Gas Turbine
Temperature Anomaly.

Step 2: From the more search options, set the minimum duration to 30 minutes and the technology to
Natural Gas.
Step 3: Select the columns that you would like to display:
6.3.2 Exercise Analyzing Inactivity


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Inactivity events can be costly as the generating units are not generating any power. Analyze
with PI DataLink the total number of Inactivity events as well as the total amount of time the
units were in an Inactive state.
Which generating unit had the most downtime events? Which generating unit had the largest
total downtime?

Approach:
Use PI DataLink to search for PI Event Frames and specify which attributes to return. Use Excel to
aggregate the events.
6.4 PI Event Frames in PI Coresight


PI Coresight is a browser- and mobile-based, ad-hoc visualization tool that helps you visualize and analyze assets. PI Coresight provides easy drag-and-drop capabilities for adding symbols onto a display canvas.

Symbols available
Trend: Add multiple attributes from assets onto a trend for easy visualization of process
changes.
Table: Display attribute values onto a table for easy comparisons. Allows for summary
statistics.
Value: View the current or historical value for any metric or process.
Horizontal Gauge: Horizontal gauge to view current levels within a range.
Vertical Gauge: Vertical gauge to view current levels within a range.
Radial Gauge: Radial gauge to view current levels within a range.
Related Assets
Element templates are powerful at defining and
normalizing all similarly-typed assets. Templates also
allow for easier searches. With Related Assets in PI
Coresight, you can create one display for a particular
asset such as a turbine and reuse this display for other
turbines.

Events
Found in the same section as Related Assets, the
Events section displays all PI Event Frames that are
associated with the asset that is visualized in the PI
Coresight display.

The events shown have a start time, end time, and name. If you click on the clock icon to the left of an event, PI Coresight will automatically update the start and end time of the display to match those of the event frame.
When you expand out an event frame, you can view the start time and end time. If you click on the icon to the left of either timestamp, you can update the corresponding timestamp for the display.
The Data Items section shows all the event frame attributes and their values. These items can be added to the display canvas.
The Assets section shows all the related assets for that particular event frame.
If there are child event frames, they are listed as well.
6.4.1 Directed Activity Gas Temperature Anomaly Events in PI Coresight


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Visualize a temperature deviation event for GAO01 using PI Coresight.
Approach:
Step 1: Create a new PI Coresight display. Drill down to asset GAO01.

Step 2: Trend the Exhaust Gas Temperature Probes for the past 24 hours.
Step 3: Click on Related Assets/Events and view the events that have occurred for GAO01. Select the
most recent Gas Temperature Anomaly event and drag the event onto the display as a table.
6.4.2 Exercise Root Cause Analysis in PI Coresight


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Engineering is interested in analyzing the process values 30 minutes before GAO01's
temperature values deviated. Use PI Coresight to create an ad-hoc display to quickly analyze
this data.
Approach:
Use PI Coresight to manipulate the start and end times to help analyze root cause.
7 SQL Query Syntax Overview


SQL stands for Structured Query Language. SQL is an American National Standards Institute
(ANSI) definition for the language used to communicate with relational database systems. It
is used by virtually all relational databases in the world today. (Even the PI Data Archive has
a SQL Subsystem that can act as a translator to make it look like a relational database).
SQL Commands are often called SQL Statements. They can be executed interactively or
as stored procedures.
The good part is that it is a standard and that every relational database you encounter will understand it. There is no need to learn many languages. However, there is a downside: most databases have extensions and/or syntax that are unique to those systems.
To give a simple example, when passing dates into Access you use pound signs (#) for
surrounding dates. On the other hand, in SQL Server you need to use apostrophes (').
Access: [...] WHERE dtColumn >= #2001-11-05#
SQL Server: [...] WHERE dtColumn >= '20011105'
A SQL result set is a set of rows from a database, as well as meta-information about the query
such as the column names, the data types and sizes of each column. Depending on the
database system, the number of rows in the result set may or may not be known. Usually, this
number is not known up front because the result set is built on-the-fly.
This flexibility allows for complex queries to be constructed and saved to return a very
specific subset of information from the AF Database that would be either too cumbersome or
impossible through the likes of PI System Explorer or PI Datalink.
Trivia: The result is stored in a result table, called the result-set. This table is held in memory.
This is often referred to in code as rs.

7.1 Dissecting the Syntax


A common SQL syntax starting command is SELECT which is used to query the database. The data
retrieved from the statement is based on the criteria specified in the SELECT statement.
The list following the SELECT command identifies the columns to be selected from the table(s).
SELECT * - retrieves all the columns from the table being referenced.
SELECT column1, column2, column3 retrieves 3 columns of the table being referenced.
The FROM command identifies the first (or perhaps only) table being queried.
SELECT * FROM tablename retrieves all the columns from tablename.
SELECT column1, column2, column3 FROM tablename retrieves all data for the 3 columns of
tablename.
The WHERE command contains criteria to filter the data being retrieved.
The conditional operators include:
equal (=)
greater than (>)
less than (<)
greater than or equal (>=)
less than or equal (<=)
not equal to (<>)
LIKE (which is a pattern matching operator)

Note: If the conditional clause is set to compare to text, the text value is encased in single quotes ('text').
SELECT * from tablename WHERE column1 = 5
Retrieves only rows where column1 has a value equal to the number 5.

AND and OR statements

AND indicates both statements must be TRUE for the row to be returned when the query is executed.

SELECT column1, column2, column3 from tablename WHERE column1 = 5
AND column2 = 'junk'

Retrieves only rows where column1 has a value equal to the number 5 and the column2 value equals 'junk'.

OR returns data rows if either condition is met.

SELECT column1, column2, column3 from tablename WHERE column1 = 5
OR column2 = 'junk'

Retrieves only rows where column1 has a value equal to the number 5 or the column2 value equals 'junk'.

The LIKE operator is used to search for a specific pattern in a column. In conjunction with the LIKE operator, the wildcard % is used for comparison. The % can represent zero, one, or multiple characters. Another wildcard is the underscore (_), which represents exactly one character.
SELECT * from tablename WHERE column2 LIKE '%unk'
Retrieves rows from tablename where column2 values end with the letters unk.
SELECT * from tablename WHERE column2 LIKE '%un%'
Retrieves rows from tablename where column2 values contain the letters un.
SELECT * from tablename WHERE column2 LIKE '_un_'
Retrieves rows from tablename where column2 values contain exactly 4 characters and the middle two characters are un.
SELECT * from tablename WHERE column2 LIKE 'j%'
Retrieves rows from tablename where column2 values start with the letter j.

To work with column/table names which have special characters, such as a space, use square brackets:
If you wish to SELECT a column called Product Orders, enclose it in square brackets: [Product Orders]
If you're referring to a table whose full path is Fleet Generation, Region, Station, Unit, that must be
written as [Fleet Generation].[SouthEast].[Brick Canyon].[PLT02]
Any name may be wrapped in square brackets, so when in doubt as to what constitutes a special character,
wrap the name in square brackets.

7.2 PI OLEDB Provider or Enterprise? What's the difference?


PI OLEDB Provider is an OLEDB data provider that provides access to the PI System. Given the correct security, the PI OLEDB Provider allows read/write access to the PI Data Archive.
PI OLEDB Enterprise is an OLEDB data provider which provides access to the PI System in a relational view, accessible through SQL queries. The PI OLEDB Enterprise provider supports read-only access to asset and event data stored in the PI Asset Framework (AF), such as AF Elements, AF Attributes, and PI Event Frames.
At the current time, PI OLEDB Enterprise is read-only against the PI AF Server/PI System. It cannot build elements or structure, and it cannot update values in PI AF.
7.2.1 PI SQL Commander


The PI OLEDB Enterprise installation includes a test environment which handles the OLE
connection process and allows the user to execute queries and perform other tasks. This test
environment is PI SQL Commander.
PI SQL Commander is the user interface to assist with creating queries, transpose functions,
and views against PI AF using PI OLEDB Enterprise. This user interface also provides
access to the classic PI OLEDB provider which builds queries against the PI Data Archive
components without knowledge of PI AF.
7.2.2 Directed Activity Review Predefined Queries


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objectives:
Review predefined queries associated with the tables defined in PI SQL Commander.
Approach:

Open PI SQL Commander


Navigate to the Fleet Generation Database/Catalog
Execute a Predefined Query associated with the Element Hierarchy table.

Launch PI SQL Commander: click Start > All Programs > PI System > PI SQL Commander.

In PI SQL Commander, verify that your PI SQL Object Explorer is visible. If it is not, click View >
PI SQL Object Explorer.

Through PI SQL Commander, either a PI AF Server or a PI Data Archive can be accessed through SQL statements, based on the item selected for connection.

From within the PI SQL Object Explorer, connect to your AF Server (in this example, PISRV1) by right-clicking it, selecting Connect, and then selecting Windows Integrated Security.
An arrow next to the server icon indicates that the connection is successful.

After connecting to the PI AF server, you will see a catalog list in the PI SQL Object Explorer. The catalogs listed correspond to each of the PI AF databases you have configured for this PI AF server. We will be using the Fleet Generation database throughout this course.

Right-click an object in the catalog that represents a table, view or function, and then select
Execute Predefined Query.

PI OLEDB Enterprise includes one sample SQL query for each table in the catalogs for PI AF
server.
The PI SQL Commander is shown below. The right side consists of the query editor on top
and the results grid below.

This is the environment for building and testing PI OLEDB Enterprise queries. Queries,
written in the editor, can be executed with their results shown in the grid.

Upon selecting Execute Predefined Query, a query window will appear containing a SELECT statement for the ElementHierarchy table.

SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy]
WHERE
Path = N'\' -- root elements

The above query does not yield all elements, just the elements at the Region level.

(Note: the N declares the path string to be Unicode, which permits lots of different characters. It will
be omitted throughout this document since normally we are only dealing with standard ASCII characters.)
Modify the query to retrieve all the elements.


SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy]
WHERE
Path like N'\%'

OR
SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy]

Returns the same results.

7.3 Aliases
Sometimes table or column names are lengthy or lack clarity. Using an ALIAS can simplify typing and clarify field names that are otherwise unclear. The AS keyword defines an ALIAS: the item before AS is given the abbreviation that follows it.
SELECT eh.* FROM [Fleet Generation].[Asset].[ElementHierarchy] as eh
In the above statement, EH can be used to identify the table instead of the full [Fleet
Generation].[Asset].[ElementHierarchy] table name. Aliases become more significant when creating
joins.
7.4 Joining Tables


Rarely does data exist in one place or in one table. Sometimes the results of a query have to come from a correlation of two or more distinct tables. To JOIN tables, a relationship is required between the tables and must be identified in the SQL statement.
Two keywords are used when creating joins between tables: JOIN and ON. The keyword ON sets up the relationship between columns in the selected tables so that the desired rows are returned.
SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy] as EH JOIN [Fleet
Generation].[Asset].[ElementAttribute] as EA
ON eh.name = ea.name
This query returns no records. The result of the next query, shown below, illustrates that the names in the two tables are not the same.

Even though both tables have columns called Name, they do not identify identical fields.
Note: Columns named the same are not necessarily referring to the same item. For example, id is a
column that is frequently found in tables representing a unique identifier for the row, but rarely do they
refer to the same item from table to table.
However, the columns named ElementID in both tables are actually the same and return a listing of all
attributes for all elements defined.
SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy] AS EH join [Fleet
Generation].[Asset].[ElementAttribute] AS EA
on eh.elementid = ea.elementid
7.4.1 Query Short-cuts


Field Aliases:
There's an unsightly problem with the query: multiple columns are named the same (Name) but are not the same. For anyone reading these query results, this is not helpful.
The solution is to rename the columns. Just as a table can be aliased, so can a column be aliased. The
keyword AS is used anytime an ALIAS is defined, whether the field is a table or column name.
SELECT eh.Name AS [Element Name], ea.Name AS [Attribute Name]
The above statement gives meaningful names to the columns in the respective tables.
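In context, the aliased columns fit into the same ElementID join used above. A complete form of the statement, simply combining the pieces already shown, might look like this:

SELECT eh.Name AS [Element Name], ea.Name AS [Attribute Name]
FROM [Fleet Generation].[Asset].[ElementHierarchy] AS eh
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] AS ea
ON eh.ElementID = ea.ElementID

Each element now appears with its attribute names in clearly labeled columns.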
Equi-Joins
An equi-join is a specific type of comparator-based join that uses only equality comparisons
in the join-predicate.

The next three statements return the same results.
The SQL statement is written in a JOIN ON format:

SELECT * FROM [Fleet Generation].[Asset].[ElementHierarchy]


join [Fleet Generation].[Asset].[ElementAttribute]
ON [Fleet Generation].[Asset].[ElementHierarchy].ElementID = [Fleet
Generation].[Asset].[ElementAttribute].ElementID
The SQL Statement written in Equi-join format:

SELECT * FROM [Fleet Generation].[Asset].[ElementHierarchy], [Fleet


Generation].[Asset].[ElementAttribute]
WHERE
[Fleet Generation].[Asset].[ElementHierarchy].ElementID = [Fleet
Generation].[Asset].[ElementAttribute].ElementID

The SQL Statement written in Equi-join format with ALIASes:

SELECT * FROM [Fleet Generation].[Asset].[ElementHierarchy] AS eh, [Fleet


Generation].[Asset].[ElementAttribute] AS ea
WHERE eh.ElementID = ea.ElementID

The above statement illustrates a case for the use of ALIASes, making the statement much easier to type and read.
7.4.2 Directed Activity Manual joins


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.
Objective:
Extract snapshots in the Fleet Generation database along with their elements.
Approach:
As a class, compare the different tables and see how they can be joined together.
What is the Exhaust Gas Temperature of Probe #1 on GAO01? Where does this asset
reside?
What type of asset is GAO01, along with its parent and grandparent, up the
hierarchy?

7.4.3 Directed Activity Element descriptions


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.
Objective:
To extract the elements in the Fleet Generation database along with their descriptions.
Approach:

Execute the Predefined Query associated with the Element Hierarchy table and modify it to obtain all elements.
Review the fields in the Element table.
Determine potential relationships between the tables to create joins.
Modify the Element Hierarchy query to add the appropriate join information to
extract the description of the elements.
Locate the Element Hierarchy within SQL.
Execute the Predefined Query.
SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy]
WHERE
Path = N'\'

Modify the query to obtain all elements


SELECT *
FROM [Fleet Generation].[Asset].[ElementHierarchy]

Review the ElementHierarchy table, which gives us the Path and Name (among other
things) of the elements in the hierarchy, but no description:

Review the Element table. Is there a link between the Element and Element
Hierarchy table?
Each element pointer within the hierarchy (i.e. each row in the ElementHierarchy
table) corresponds to an element object from the overall set of Elements (i.e. a row in
the Element table). Behind the scenes, these objects are linked by GUIDs (Globally
Unique Identifiers). The purpose of a GUID is to give individual objects an identifier
guaranteed to be unique. Meaningless to the human eye, they look like:
9abd6084-6c74-4645-a7a0-833f6c25de3d
GUIDs (often contained in table columns ending in ID) are how each table relates
each row (element pointer) in ElementHierarchy to each row (element) in Element:

Modify the Element Hierarchy query to include the description from the Element table.
SELECT eh.path, eh.name, e.description
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh INNER Join [Fleet
Generation].[Asset].[Element] E on eh.elementid = e.id
Note: In the above statement, the tables have ALIASes, but the word AS is not in the statement as it is
understood.

The results of the above query yields the name of the element and the description
associated with the element.
7.4.4 Exercise: Query for specific elements


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.
Objective:
To extract the elements in the Fleet Generation database that are Units (Element Template) and are
located in the North Region. The fields that we want in our result set are the Unit Name, Path, and
Description.
Approach:
Execute the Predefined Query associated with the Element Hierarchy table and modify it to obtain all elements.
Review the fields in the Element Templates table.
Determine potential relationships between the tables to create joins.
o Hint: The Element table has a field called ElementTemplateID
Append a WHERE clause to filter based on the Path and Element Template.
Determine the fields to return and the tables associated to each field.

7.5 Built-in Functions


PI OLEDB Enterprise provides some built-in functions specific to the PI System, which can be used from PI SQL Commander. If you are familiar with SQL, you may already be familiar with functions. For example, aggregation functions such as Max() or Avg() return the maximum or average of a group of rows. An entire list of built-in functions is available in the user guide for PI OLEDB Enterprise.
One of the PI functions that will be used in subsequent exercises is ParentName(). Instead of returning the complete Path, the ParentName function of PI OLEDB Enterprise is used to break up the AF element path into separate columns of the result table. The names in square brackets are used to rename the columns to something perhaps better suited for reporting. Again, the eh ALIAS prefix is required to identify the source of the field.
SELECT
eh.Name [Unit]
, ParentName(eh.Path,0) [Station]
, ParentName(eh.Path,1) [Region]
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
Where eh.Level=2

7.6 Data Tables


In the previous sections, we saw the process to query for elements from the Fleet Generation database
through a series of table joins between Asset Framework object tables within PI SQL Commander. The
tables within PI SQL Commander are not limited solely to Elements, Element Hierarchy, and Element
Templates.
Within PI SQL Commander, there are several tables under the [AF Database].[Data] path that will allow
the user to extract real-time and archive values from the PI Data Archive. A query against these tables
will return either Element Attribute data or Event Frame data. In order to utilize these tables, a query
needs to have an INNER JOIN to the ElementAttribute table and a specific Data table. The
ElementAttribute table allows for the mapping between the data and specific attributes associated with a
set of elements.

The tables corresponding to Element Attribute data are listed below:
Table                    Description
Archive                  Returns archive / compressed data
Snapshot                 Returns values in the snapshot (current values)
ft_InterpolateDiscrete   Returns interpolated value given timestamp
ft_InterpolateRange      Returns interpolated values at evenly distributed timestamps
ft_Plot                  Returns minimum data required for trending

Note: Similar tables exist for data from Event Frame attributes. Typically, only the Event Frame Snapshot table is queried against, as each Event Frame contains its own start and end times.
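For reference, pulling historical values follows the same join pattern as the Snapshot query built in the next activity. A sketch against the Archive table is shown below; the attribute name and one-day time range are illustrative, and the column names are assumed to mirror those of the Snapshot table:

SELECT eh.Name [Unit], ea.Name [Attribute], a.Time, a.Value
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] ea
ON ea.ElementID = eh.ElementID
INNER JOIN [Fleet Generation].[Data].[Archive] a
ON a.ElementAttributeID = ea.ID
WHERE ea.Name = 'Gross Generation'
AND a.Time BETWEEN 't-1d' AND 't'
OPTION (FORCE ORDER, EMBED ERRORS)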
7.6.1 Directed Activity Snapshot Values


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: Create a query to extract real-time values for the Gross Generation and Net
Generation attributes of all units. The fields in the result set should contain Element Name,
Station Name, Region Name, Attribute Name, Timestamp, and Value.

Approach:
The creation of this query requires several steps.
1) Run the predefined query of the Element table to obtain all elements.
2) Remove the WHERE clause that filters the results to all root elements.
SELECT *
FROM [Fleet Generation].[Asset].[Element] e
3) Apply an INNER JOIN to the ElementHierarchy table.
SELECT *
FROM [Fleet Generation].[Asset].[Element] e
INNER JOIN [Fleet Generation].[Asset].[ElementHierarchy] eh
ON e.ID = eh.ElementID
4) Apply an INNER JOIN to the ElementTemplate table.
Hint: Use aliases.
5) Add a WHERE clause to return only elements that are Units.
SELECT *
FROM [Fleet Generation].[Asset].[Element] e
INNER JOIN [Fleet Generation].[Asset].[ElementHierarchy] eh
ON e.ID = eh.ElementID
INNER JOIN [Fleet Generation].[Asset].[ElementTemplate] et
ON e.ElementTemplateID = et.ID
WHERE et.Name = 'Unit'
6) Apply an INNER JOIN to the ElementAttribute table.
7) Append to the WHERE clause to return only attributes that are either Gross
Generation or Net Generation.
SELECT *
FROM [Fleet Generation].[Asset].[Element] e
INNER JOIN [Fleet Generation].[Asset].[ElementHierarchy] eh
ON e.ID = eh.ElementID
INNER JOIN [Fleet Generation].[Asset].[ElementTemplate] et
ON e.ElementTemplateID = et.ID
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] ea
ON ea.ElementID = e.ID
WHERE et.Name = 'Unit' and (ea.Name = 'Gross Generation'
OR ea.Name = 'Net Generation')
8) Apply an INNER JOIN to the Data Snapshot table.
9) Specify the fields for the result set.
SELECT
eh.Name [Unit]
, ParentName(eh.Path,0) [Station]
, ParentName(eh.Path,1) [Region]
, ea.Name [Attribute]
, s.Time
, s.Value
FROM [Fleet Generation].[Asset].[Element] e
INNER JOIN [Fleet Generation].[Asset].[ElementHierarchy] eh
ON e.ID = eh.ElementID
INNER JOIN [Fleet Generation].[Asset].[ElementTemplate] et
ON e.ElementTemplateID = et.ID
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] ea
ON ea.ElementID = e.ID
INNER JOIN [Fleet Generation].[Data].[Snapshot] s
ON s.ElementAttributeID = ea.ID
WHERE et.Name = 'Unit' and (ea.Name = 'Gross Generation'
OR ea.Name = 'Net Generation')
7.6.2 Exercise Interpolated data


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Create a query to extract hourly interpolated data for the Demand attribute of all units over
the past four hours. The fields for the result set should include Element Name, Attribute
Name, Timestamp, and Value.
At what time does demand tend to be highest across all units?

Approach:
The creation of this query requires several steps.
1) Run the predefined query of the ft_InterpolateRange table.
2) Remove the portion of the WHERE clause that filters the results to all root elements.
3) Modify the Start Time, End Time, and TimeStep (see the sketch after this list).
4) Apply an INNER JOIN to the Element table.
5) Apply an INNER JOIN to the ElementTemplate table.
6) Add a WHERE clause to return only elements that are Units.
7) Append to WHERE clause to return only the Demand attribute.
8) Restrict the SELECT to return the desired fields.
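As a starting point, the time parameters of the ft_InterpolateRange table are supplied through the WHERE clause. The sketch below is only a hint: the [Unit] and [Attribute] labels and the four-hour window are illustrative, the column names are assumed to follow the same pattern as the Snapshot table, and the Unit template and Demand attribute filters from steps 4 through 7 still need to be added.

SELECT eh.Name [Unit], ea.Name [Attribute], ip.Time, ip.Value
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] ea
ON ea.ElementID = eh.ElementID
INNER JOIN [Fleet Generation].[Data].[ft_InterpolateRange] ip
ON ip.ElementAttributeID = ea.ID
WHERE ip.StartTime = 't-4h'
AND ip.EndTime = 't'
AND ip.TimeStep = '1h'
OPTION (FORCE ORDER, EMBED ERRORS)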
7.7 Data Transpose Functions & Function Tables


As seen above, the data comes back in tabular form, but does not lend itself to easy interpretation.
Below is the query from the previous section that illustrates the difficulty in reviewing the snapshot data
for the attributes of the Elements. Notice that the attributes are returned in rows.
SELECT eh.name, ea.name, s.time, s.value
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[Asset].[ElementAttribute] ea
ON ea.ElementID = eh.ElementID
INNER JOIN [Fleet Generation].[Data].[Snapshot] s
ON s.ElementAttributeID = ea.ID
OPTION (FORCE ORDER, EMBED ERRORS)

A portion of the results from the above query is displayed below.

which is, in many senses, an awkward way to look at the data. It is much more legible to the human eye if we rotate, or transpose, the values as follows:

PI OLEDB Enterprise can generate transpositions similar to what is above. A wizard walks
you through the process of creating a transpose function for any Element Template of your
choosing.
7.7.1 Transpose Function Wizard


For various use cases, such as reports or OLAP cubes, attribute values need to be returned in
a way so that each column represents an attribute. This is contrary to a typical relational
representation, where each value of each attribute is normally represented in consecutive
rows. To represent multiple attributes in this "one column per attribute" format, one could join a data table with itself multiple times, but the resulting query string would be rather large and complex. To help with this, PI OLEDB Enterprise provides a way to create custom Table-Valued Functions (TVFs) and derived function tables, to get "transposed" result sets of the related data tables.
Under each PI AF database branch, there are four folders, Assets, Data, DataT and
EventFrame.

The Tables folder under Data shows the tables and columns that provide access to
snapshot and historical data from the PI System.
Under both the Assets and Data folders there are two additional folders called Views
and Functions. These folders are initially empty and provide places for you to organize the
views and functions you create.
In general, creating and editing queries and views is a restricted activity. The changes are contained in
the PI AF SQL Database and access will be controlled by the database administrator.
The DataT branch of the hierarchy is for working with transpose functions.
Transpose functions allow you to obtain tables of PI AF information based on AF element templates.
This folder comes with the same subfolder structure as Assets and Data, but the folders are initially empty until transpose functions are created manually.
Transpose functions can be created using the wizard discussed in the next section.

7.7.2 Directed Activity Unit Transpose Functions


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: Create a transpose function for the Fleet Generation database to be used in
analyzing plant generation data.
Approach:
There are four transpose functions available. Each transpose function returns a
dataset made up of columns for every attribute of an element template, where each
row returns values based on a different time basis.

Function                         Snapshot or Archive   Rows/element returned
Transpose Snapshot               Snapshot              1 row per element attribute for element(s) selected
Transpose Archive                Archive               1 row per element archived attribute value for
                                                       element(s) selected over a specific time range
Transpose Interpolate Discrete   Interpolated          1 row per element; returns interpolated value
                                                       based on timestamp
Transpose Interpolate Range      Interpolated          1 row per element returned for each interval for
                                                       each element attribute, based on time range and interval
Create the transpose functions required for our report.

Step 1
Access the Transpose Function Wizard by right-clicking on the DataT folder under the AF Database catalog you wish to build a transpose function for, and select New Transpose Functions > Asset.

Step 2
Select the PI AF template(s) you want to create transpose functions for. You can select as many as you want; the wizard can build multiple transpose functions per pass. Here, we will select UNIT and click Next.

Step 3
If the element template has sub-attributes, they
will appear in this dialog and you can select
the ones you want included. The UNIT
template has no child attributes, so do not select Include Subtree; then click Next.
Step 4
Select the type of transpose functions you want
the wizard to create. Notice that there are
actually two sets of functions. Those starting
with the letter v allow you to specify the
version of the element template (based on its
effective date) you wish to build the transpose
function from.
In this example, we will not use the versioning
capability of PI AF, so select the
TransposeSnapshot and
TransposeInterpolateRange functions. These
functions will be created for each of the element templates selected in the previous step. Then
click Next.

Step 5
The wizard will provide names for the functions. For PowerPivot, it's best to clear the Values as VARIANT check box. In this example, we will be working with transpose function tables, so check the Create function tables check box. The wizard will now create the transpose functions and their associated transpose tables for the Unit template. Click Next.

Step 6
Review the summary and click Next.
Step 7
After the wizard processes your request, click Done. The Unit transpose functions and tables are now ready to use.

Step 8
From the PI SQL Commander hierarchy, the
transpose tables and functions created by the wizard
should appear under the DataT folder of the Fleet
Generation PI AF database.
As for how to use your newly created function, examine the snapshot function in the PI SQL Object Explorer:

Reality check: If we call the ft_TransposeSnapshot_UNIT function, the same columns exist in the function as in the original template, such as Effective Generating Capacity, Generating Efficiency, etc.

A Predefined Query is associated with transpose functions. Execute the query.

The default query retrieves no values. Modify the query to return all Units.
Remove WHERE eh.Path = N'\'

Re-execute the function after removing the where clause.

SELECT eh.Path + eh.Name Element, tc.*


FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[DataT].[ft_TransposeSnapshot_UNIT] tc
ON tc.ElementID = eh.ElementID
OPTION (FORCE ORDER, IGNORE ERRORS, EMBED ERRORS)
7.7.3 Exercise Create an Event Frame Transpose Function


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective:
Attributes from the generation units and the event frames will be used to analyze production
data from the plants.

Approach:
Use the transpose function wizard to create a snapshot event frame function using the
Inactivity and Gas Turbine Temperature Anomaly templates.
Verify the results of the transpose function through the execution of the pre-defined query.
Hint: The steps are almost identical to the ones used when creating an Asset transpose
function.
7.8 Saved views


Sometimes it is necessary to generate reports on a routine basis. To minimize the effort required to generate the data, creating views may be an option. Let's save our previous query as a view to enable frequent generation of the data.
7.8.1 Creating dataset views
PI SQL Commander supports the creation of views. Views allow you to name a stored query
and it is this name that appears in the table list when importing data into PowerPivot. Views
are the easiest way to allow users to select which datasets they want from PI AF when
creating a PowerPivot report, as they do not need to understand the complexity of the
underlying SQL query.
Views are created using SQL syntax, but OLEDB Enterprise can give you a template to start
with. If you're trying to create a saved query showing information about assets, consider
creating it in the Asset schema (folder). If you have a saved query showing data values, for
organizational purposes, place it in the Data schema. The image below shows a right-click
menu giving the Create View option:

Selecting Create View produces the beginning of a query:

At this point, it is a matter of naming the view by replacing <view name> and pasting the query into the <query> placeholder.
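For instance, wrapping the element-description query from section 7.4.3 into a view (the view name Element Descriptions is illustrative) would produce a statement like the following sketch:

CREATE VIEW [Fleet Generation].[Asset].[Element Descriptions]
AS
SELECT eh.Path, eh.Name, e.Description
FROM [Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[Asset].[Element] e
ON eh.ElementID = e.ID

Once created, the view appears under the Asset schema and can be queried like any other table.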
7.8.2 Directed Activity View Creation for Unit Performance


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: Create a view, Unit Performance, using previously created asset interpolated
range transpose functions for frequently changing process data (Transpose Interpolate
Range).
Approach:

Run the Unit Interpolated Range transpose function using the Execute Predefined
Query. Note that no INNER JOIN to the ElementTemplate table is necessary as the
transpose function was created against the UNIT template.

Modify query to obtain all unit information by removing the Path portion of the
WHERE clause.
WHERE eh.Path = N'\' AND tc.StartTime = DATE('*-1h') AND tc.EndTime = DATE('*') AND tc.TimeStep = '30m'

Modify date range to include data from the past week.


WHERE tc.StartTime = DATE('t-7d') AND tc.EndTime = DATE('t') AND tc.TimeStep = '1h'

Create a view from the transpose function.

Fill in the required information to create the view.

Name the View: Unit Performance


Use modified transpose function for the query
CREATE VIEW [Fleet Generation].[DataT].[Unit Performance]


AS
SELECT eh.Name [Unit Name]
, eh.ElementID [Unit ID]
, tc.Time
, tc.[Demand]
, tc.[Net Generation]
, tc.[Gross Generation]
, tc.[Utilization]
, tc.[Generating Efficiency]
, tc.[Total Hourly Gross Generation]
FROM
[Fleet Generation].[Asset].[ElementHierarchy] eh
INNER JOIN [Fleet Generation].[DataT].[ft_TransposeInterpolateRange_UNIT] tc
ON tc.ElementID = eh.ElementID
WHERE

tc.StartTime = 'T-7d' AND


tc.EndTime = 't' AND tc.TimeStep = '1h'

OPTION (FORCE ORDER, IGNORE ERRORS, EMBED ERRORS)

Execute the function.

If successful, a success message will be displayed; otherwise, an error will be shown in the lower region of the query section.

Refresh the View section and verify the Unit Performance View is present.

The definition for the view can be seen by selecting the Alter option in the View
folder.


7.8.3 Exercise Create Unit Specification Views


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Create an additional view, Unit Specifications, containing slow changing Unit metadata, for use in
managing PowerPivot reports.
What else could you add to this view? What sort of data would you like to extract from this
view?

Approach:
Create a view that returns metadata using the query from the TransposeSnapshot_Unit.
The result query set should contain the following fields:
1) Unit Name
2) Unit ID
3) Station Name
4) Region Name
5) Carbon Emissions
6) Hourly Capacity
7) Operator
8) Technology
Approximate time: 20 minutes


8 Importing PI Data for use in PowerPivot


8.1 Introduction
Business Intelligence tools are only as good as the data that fills them. This lab centers upon
the process of exposing assets and data from the PI System into data cubes. We will continue
to use the database from a fictitious generation company called Fleet Generation.
There are two main components to tackle: preparing the PI System for the cube, and writing
the queries which expose useful information. The former will be a recurring theme as we
perform the latter.
The combination of PI OLEDB Enterprise, Microsoft PowerPivot for Excel, and Microsoft Power
View provides an exciting set of technologies supporting advanced data analysis
and enterprise awareness. These tools bring the power of multidimensional data analysis to
all PI users, allowing innovative reporting within Microsoft Excel and in
Microsoft SharePoint. This training session describes the steps needed to create an example
PowerPivot report for analyzing Fleet Generation.
Note: PowerPivot is standard in Excel 2013 and is a free add-in to Excel 2010 and 2007 and is available
from Microsoft at http://www.microsoft.com/en-us/bi/powerpivot.aspx.

8.2 Importing PI AF datasets


The first thing to do when using PowerPivot is to import the data you want to analyze.
Importing data requires connecting to the data source holding the data, specifying the data
you need from the data source (by selecting a database table, view, or writing a query), and
then importing the data into PowerPivot. The PowerPivot input dialog also gives you the
ability to preview and to specify additional filters on the data as it is imported. However, for
this example, the following steps will describe how to import the complete datasets from the
PI OLEDB Enterprise views defined in the previous section of this document.


8.2.1 Directed Activity Importing View Data Previously Created


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective: All the pre-work is complete; now it is time to start building the report.

Approach:

Open MS Excel
Select the PowerPivot tab to access the PowerPivot ribbon shown below. Clicking
on the Manage button will launch the PowerPivot window shown in the next step.

Explore Get External Data


The PowerPivot window is empty, waiting to import data. Exploring the Get
External Data section of the ribbon gives a great overview of the various ways
PowerPivot can access data sources.

Clicking on the icon for From Other Sources will start the dialog for
importing data from other data sources, like PI OLEDB Enterprise. (The From Other
Sources icon looks slightly different in Excel 2010.)

Select Others(OLEDB/ODBC)

Scroll down the list of connection types until you


get to the Others (OLEDB/ODBC). Select it and
click Next.

Name the connection Fleet Generation

Build Connection String

PowerPivot requires a connection string to access each


data source. Click the Build button to construct the
string for connecting to PI AF through PI OLEDB
Enterprise.
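For reference, the finished connection string typically resembles the single line below. The provider name shown (PIOLEDBENT.1) and the property names are assumptions based on common OLE DB conventions, so build the string through the dialog rather than typing it by hand:

Provider=PIOLEDBENT.1;Data Source=PISRV1;Integrated Security=SSPI;Initial Catalog=Fleet Generation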


Select PI OLEDB Enterprise

Select the Provider tab and scroll through the providers


list until you come to PI OLEDB Enterprise.

Update Integrated Security

Select the All tab and then edit the Integrated Security
value by selecting this item from the list, clicking Edit
Value and typing SSPI in the edit dialog. This tells the
provider PowerPivot is going to attempt authentication via
Windows Integrated Security.

Click OK to close the Edit Property Value window and


click OK again to close the Data Link Properties window.


Complete Data Source, login info and database.

Select the Connection tab and enter your PI AF server


name, PISRV1, as the Data Source. We will use
Windows NT Integrated security. Once you set the data
source, the PI AF catalog (database) list should get
populated in the bottom dropdown. Expose the list and
select the Fleet Generation PI AF database.

Test the connection

The connection string to the Fleet Generation database on


the PI AF server has been configured.
Click the Test Connection button to verify your
connection. Close the connection test pop-up and click
Next to continue with the data import process.


Select How to Import the Data

Now that we have defined the connection to the Fleet


Generation PI AF database, we have to specify what data
we want to import into PowerPivot. Since we have taken
the time to define views for our datasets in PI SQL
Commander, make the top choice and click Next.
Selecting the second option will take you to a query
builder where you can enter any query you like. That
approach usually works best when you have an
application like PI SQL Commander to test your query
first before copying and pasting it into the dialog
provided.

Select Unit Performance and Unit Specifications

The last task is to select the two views we created using


PI SQL Commander for the Fleet Generation database.
Views will be at the end of the list as the grid puts tables
first. These tables are provided by PI OLEDB Enterprise
for every PI AF database.
Click Finish to start importing data into PowerPivot.


Review importing results

At this point, PowerPivot will begin importing data from


PI AF. When the import is complete, click Close.

The imported data will now appear in the PowerPivot window. Each dataset will have its
own worksheet and can be accessed by selecting the appropriate tab at the bottom of the
PowerPivot window.


8.3 Linked tables from Excel


In this report, we would like to analyze total profitability based on our profit margin for each
generating technology. This information, while available in our PI AF model, is not available
in either of the two views that we created, but it can easily be imported into Excel from SQL,
and added to PowerPivot. Alternatively, the data can be entered into a table in our Excel
spreadsheet and added to PowerPivot as a linked table.
This linked table can be then connected into our PowerPivot data cube to enable an additional
dimension for slicing and dicing.


8.3.1 Directed Activity Importing non PI Data


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

SQL table:

Technology     Rate
Coal           0.034
Biomass        0.054
Geothermal     0.069
Natural Gas    0.078
Wind           0.12
Nuclear        0.083
Select the PowerPivot tab to access the PowerPivot ribbon. Within Get External Data, this
time select From Database and From SQL Server.
The profitability numbers for each generation technology have been
entered into the GenerationRates table within the FleetGeneration SQL
database. Alternatively, the data can be brought in through an Excel
spreadsheet.

Within the table import wizard, the local SQL server and the target FleetGeneration database
can be specified. A default name of SqlServer PISRV1 FleetGeneration is built from the
connection type, SQL host server, and database name. This can be changed to something
friendlier, like Generation Rates.


Click Next and choose to select from a list of tables to import the data. If we had a large
dataset, it would be preferable to write a query so that the input data could be filtered and joined
with any relevant tables, similar to the views we created within PI SQL Commander. That way
Excel does not need to parse an excessively large result set. As this table only has 6 rows,
we can bring in the entire table.
From the table selection, we have the option to bring in the EmissionRates and
GenerationRates tables. A single connection can serve requests for any of the tables within the
database, so only one data source needs to be defined per SQL database. Currently
we only want the GenerationRates table, so select only that row. Filtering conditions can also
be added here if we wanted to reduce the result set. A friendly name for the table can also be
specified, such as Generation Rates.

Once imported, the table will then be visible within the data model.


Excel Table
A similar process can be used to grab data from within an Excel worksheet, and create a linked table.
The profitability numbers for each generation technology have also been entered into a separate
worksheet, named OtherTables, of the Student Starter spreadsheet. Copy the table into our
existing workbook. (The rate is in $/kWh, located under Energy Cost under the Unit of
Measures.)
From the Insert Ribbon, select Table and specify the cells containing our generation rate data.

Give the table a meaningful name, such as GenerationRate, since the default is Table# where # is the
number in the sequence of table addition.
Place the cursor in any cell of the table, then select the PowerPivot ribbon. From the PowerPivot
menu, select Add to Data Model.

This will link the data you have entered into Excel with the data in PowerPivot. Like the datasets we
already imported, each linked table will have its own tab at the bottom of the PowerPivot window. Since
table names in Excel do not support spaces, you may rename the table in PowerPivot to include a space.


8.3.2 Exercise Prepare Tables for Importing


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Bring static data into Excel for use in creating the production data cube.
Which method of importing data do you prefer? What are the advantages and disadvantages of
each?
Approach:

Create a new spreadsheet in the existing Excel workbook.


Type in the table information displayed below.
Add to data model.
Rename the tab to a meaningful name, Zip Codes.
City                Zip Code
Albertsville        55301
Beryl Ridge         52403
Carbondale          26241
Ebbitt              14605
Greenlawn           11740
Madison             28269
New Bedford         95115
Brick Canyon        26330
Carter              30627
Octavia             34470
Stampton            29684
Vicksberg           39180
Wolverine Station   35990

Expected completion time: 10 minutes


8.4 Table Refresh


PowerPivot will always remember the connection strings and queries used to obtain the tables
of data it uses.

You can refresh data from the PowerPivot window using the Home ribbon and selecting
Refresh for the displayed table, or Refresh All for every table.


9 Creating the Cube and Adding Calculations


Once the tables are imported into PowerPivot, the cube is almost ready to be created. In
order to create a report, relationships between the tables must be defined.

9.1 Establishing table relationships


At this point, all we have given PowerPivot are four independent tables of data. In order to
analyze data in PowerPivot, these tables must be related. PowerPivot provides a very
easy way to specify table relationships through the configuration dialog shown at right.
When building PowerPivot relationships, the order is significant. Always think many to
one (alphabetical, i.e. m before o). In other words, the table having many rows with the
same value goes first. The column selected in the second table must have only one row for
each value.
Configuring the relationships shown above is very easy to do.
Change the view to the diagram view.
The tables will appear in the display window.

Relationships can be created by dragging and dropping your cursor from one table field to
another.


The arrow will point in the direction of the table that has the fewer references. Note that in
the example above, we can create a relationship either on the Unit ID fields or Unit
Name fields. If the Unit Name is not unique in the Fleet Generation database, then the
relationship should be created using the unique Unit ID identifier.
If you want to verify the fields the relationship was built upon, the relationships can be
reviewed in greater detail through the Manage Relationships option under the Design tab in
PowerPivot.

All built relationships should be listed.

Select the relationship of interest and select Edit.


Relationships can also be configured directly from any PowerPivot table by right-clicking on
any column header and choosing Create Relationship from the menu.

Upon selection, the Create Relationship dialog is displayed to allow the relationship to be defined.


9.1.1 Exercise Establishing table relationships


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective: Complete the relationships between tables within PowerPivot in order to add
dimensions to complete the data cube.

Approach:

Determine the fields to relate the remaining Zip Codes and Generation Rates
tables.

Expected completion time: 10 minutes


9.3 Adding calculated columns using the Data Analysis Expression Language (DAX)

At this point, it is necessary to begin the discussion on the Data Analysis Expression
Language or DAX. As you will see, DAX provides users with the ability to extend data
initially imported into PowerPivot and is really a key differentiator between PowerPivot and
traditional, server-based BI applications. DAX has two uses: it can be used to add new
columns to tables and it can be used to create measures.
At the far right of every PowerPivot table is an empty column with the heading Add Column.
New columns of data are entered by selecting this column and typing in DAX formulas in the
formula dialog above the displayed PowerPivot table. Alternatively, you can choose any
column, right-click on it, and select Insert Column.
Note: Configuring a DAX calculation is very similar to adding calculated cells in Excel, except that
PowerPivot adds calculated columns (i.e., calculations that are applied to every row, or value, of the
dataset).

The DAX language is a formula language that allows users to define custom
calculations in PowerPivot tables (calculated columns) and in Excel PivotTables (measures).


Like Excel formulas, to create a DAX formula, you type an equal sign, followed by a function name or
expression, and any required values or arguments.

Some differences include:

DAX cannot reference only a few cells or a range of cells; DAX always works with
complete columns or tables.
If you want to use only particular values from a table or column, you can add filters
to the formula.
If you want to customize calculations on a row-by-row basis, PowerPivot provides
functions that let you use the current row value or a related value to perform
calculations that vary by context.
DAX includes functions that return a table as their result, rather
than a single value. These functions can be used to provide input to other functions,
thus calculating values for entire tables or columns.
Some DAX functions provide time intelligence, which lets you create calculations
using meaningful ranges of dates, and compare the results across parallel periods.

9.4 Where to Use Formulas


The same formula can behave differently depending on whether the formula is used in a
calculated column or a measure. You can use DAX formulas either in PowerPivot tables, or in
PivotTables in Excel:

You can use formulas in calculated columns, by adding a column and then typing an expression
in the formula bar. You create these formulas in the PowerPivot window. The formula is applied
to every row in the column, but its result can vary with context.
You can use formulas in measures. You create these formulas in Excel, by clicking Add Measure
in an existing PowerPivot PivotTable or PivotChart. The design of the PivotTable and the choice
of row and column headings affects the values that are used in calculations.
Where DAX formulas differ from Excel formulas is that DAX functions work with tables and columns,
not ranges, and let you do sophisticated lookups to related values and related tables.
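As a quick illustration (using the Gross Generation and Net Generation columns from the Unit Performance view imported earlier; the result names are only examples), the same data could be extended either way:

Calculated column, typed into the Add Column area of the PowerPivot window:
= [Gross Generation] - [Net Generation]

Measure (calculated field), typed into the calculation area below the table:
Station Losses := SUM('Unit Performance'[Gross Generation]) - SUM('Unit Performance'[Net Generation])

The calculated column stores a value on every row, while the measure is evaluated only when it is placed in a PivotTable, against whatever rows the report context selects.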


9.4.1 Directed Activity Create a Total Hourly Emissions Calculation


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Create a Total Hourly Emissions calculation in the Unit Performance table.
Approach:

Open the PowerPivot Window.


Select the Unit Performance Table.
Navigate to the end of the table to an open column.

Start typing the formula into the formula reference area.

Total Emissions Table

Total Hourly Emissions: unit hourly emissions. Format: Decimal Number
= [Total Hourly Gross Generation] * RELATED('Unit Specifications'[Carbon Emissions])
= MWh * g / kWh * (1000 kWh / MWh)

The RELATED function is required when you are including columns from another
table in your DAX equation.

DAX calculations follow a simple syntax. To specify a table column, use 'Table
Name'[Column Name].

Table names can be omitted if they refer to columns in the same table, similar to
omitting worksheet names in Excel when creating calculated cells.

Note: Clicking the fx (Insert Function) icon will give you a list of available DAX functions, many of which are identical
to those offered in Excel.


9.4.2 Exercise Create a Cost Calculation


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective: Add the cost calculation column to your PowerPivot Unit Performance table
using DAX. The RELATED function is required when you are including columns from
another table in your DAX equation.
Do you prefer having the calculations within AF as a formula data reference, or within
PowerPivot? What are some advantages and disadvantages of each?

Approach:

Open the PowerPivot Window.


Select the Unit Performance Table.
Navigate to the end of the table to an open column.
Type the formula to determine the Cost (Total Hourly Gross Generation x Rate).
Set the correct field format.


10 Building the Fleet Generation Report


Designing a PowerPivot report is an ad hoc experience. This is easily done by dragging and
dropping items from the PowerPivot field list (table columns) into the various report areas
provided by the report configuration dialog. Moreover, the best part is that you can always
open a new Excel worksheet to create additional reports for a different purpose, or to try
something new, from the same data.

10.1 Creating PowerPivot tables


To create a new pivot table report in Excel, select the Insert ribbon and select the Tables
dropdown menu.
You will see that there are several combinations of
PowerPivot tables and charts to select from.
Select the first option Pivot Table. Specify the external
data source from the Tables in Workbook Data Model.
The cell reference shown becomes the upper left-hand
corner of the pivot table. Check Existing Worksheet or
New Worksheet in the Create PivotTable dialog, then
click OK.
The PowerPivot field list should now be shown in your
Excel worksheet and a range of the worksheet should be
designated where the pivot table will be located, as shown
below.


Taking a look at the PowerPivot Field List, at the top there is a


hierarchy that shows the PowerPivot datasets and when they are
expanded there is a list of available columns.

Down below is the dialog area where items can be dropped to


configure the report.

The Values area, in the lower right, is the aggregation area.


Default aggregation for numeric values is SUM.
Default aggregation for non-numeric values is COUNT (number of rows).
For numeric items, you can change the aggregation method by clicking the down arrow next to the item
and choosing Value Field Settings.
In the Value Field Settings dialog you can pick a different aggregation or change the name used for display
purposes. In this example, it makes more sense to aggregate the Utilization and
Generating Efficiency columns as averages, not sums.


The Column Labels and Row Labels areas are for
specifying which items are to be used for the column and row
headers of the table, commonly known as dimensions.
Drag and drop the Region field from Unit Specifications to the
Rows area, and Station from Unit Specifications to the
Rows area underneath the Region field.
If you place more than one item into the Row Labels or
Column Labels areas, they become nested in the report as
subheadings for either columns or rows. You can change the
nesting order by dragging items up and down within the area to reorder them. PowerPivot
will take care of changing the report.


So far, we have some numbers and locations in our report:

Slicers consist of a set of buttons that users can select, or multi-select, to set the report filters,
and they can be oriented either vertically or horizontally on the worksheet.
The option to insert slicers is located under the Analyze tab of the PivotTable Tools menu.

After some dragging and dropping, we can quickly configure the report shown below to
summarize generation by region and station. You will notice that the table is configured
dynamically as you drop items into different areas. Feel free to play around until the layout
works the way that is most useful for you.


10.1.1 Exercise One Version of the Truth


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Throughout the company, reports are generated and reviewed at meetings. You have been
requested to generate a report that all regions can use for production meetings so all managers
are referencing data in a similar context.
Which region is producing the most power? How does this change for different operators?

Approach:
1. Review production management requirements.
2. Verify all 4 tables are in PowerPivot and the relationships are built.
3. Create a PowerPivot table that details the Stations and Regions and the power
being generated.
4. Add Slicers to the report for Technology and Operator.
Estimated time 20 minutes.
Table with Slicers:


10.2 Formatting tips


There are a few pivot table formatting tips that are worth mentioning at this point.
First, right-clicking anywhere in the pivot table will present a menu. Selecting PivotTable
Options from this menu will open the dialog shown here.
Several useful formatting options can be found under the following tabs:

Layout and Format: Clearing the Autofit checkbox will allow you to set cell column widths the
way you want. PowerPivot will resize these automatically if you leave this checked.


Totals and Filters: If it can, PowerPivot will always try to provide an extra column and row
for grand totals. You can turn either of them off here if they are not what you want.

Display: PowerPivot will collapse rows or columns when no data is shown. This means the
pivot table may shrink or grow as users select slicers. Keeping the table the same size may
make things easier to compare. You can select these items to display blank rows and columns.


Secondly, when designing your report, you may want to


reformat the numeric values displayed. The best way to do
this is to format the measure that generates the values.
Right-click on any value you want to format and select
Value Field Settings to display the dialog. Select the
Number Format button and format the value the way you
are accustomed to in Excel.

From the Number Format button, you can also specify the
precision of the aggregation.


10.2.1 Exercise Consistent Table Layout


This solo or group exercise is designed to maximize learning in a specific topic
area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Modify the layout of the report to reduce the adjustments required by the viewers of the
report.
Approach:

Review the report by selecting different slicing options and observe how the
report changes. Is it visually appealing?

Modify the report to reduce the adjustments the eyes have to make based on
filtering, collapsing of rows and columns, number format, etc.

Estimated Time: 10 minutes


10.3 PowerPivot charts


Now that we are familiar with pivot tables, let us add a chart to your PowerPivot report; this
is very similar to adding a table. From the PivotTable item on the PowerPivot ribbon,
select Pivot Chart.
Check Existing Worksheet (the worksheet containing the PowerPivot Table) in the Create
PivotChart dialog. The cell reference shown becomes the upper left-hand corner of the pivot
chart. Then click OK.
Pivot charts are like any other Excel chart in that they can be dragged or resized anywhere in
the worksheet.
To configure the pivot chart shown in this example, items from the PowerPivot Field List
were dragged into the appropriate dialog area, just as they were for the pivot table. The
completed chart configuration is shown below. PivotCharts can be formatted using the
standard Excel chart formatting tools for type of chart, color scheme, etc.


Captions and filter drop-downs can be removed through the PivotTable Options.


10.4 DAX Time Intelligence


As we saw in the last exercise, PowerPivot will sort things numerically, alphabetically, or
chronologically when they are displayed in your report.
Since there is currently no field in our data cube for the day of the week, we can utilize DAX
functions to extract this information from the timestamp in the Unit Performance table.
One particularly useful function is FORMAT, which converts a timestamp to text in a specified
time format. Below are several time-based DAX format codes that can be used in
conjunction with a timestamp. The proper syntax is FORMAT([Time], [Format]).
As an example, FORMAT([Time], "mm") will return the minute from a given timestamp.

Format   Description
d        Displays the day as a number without a leading zero.
dd       Displays the day as a number with a leading zero.
ddd      Displays the day as an abbreviation.
dddd     Displays the day as a full name.
M        Displays the month as a number without a leading zero.
MM       Displays the month as a number with a leading zero.
MMM      Displays the month as an abbreviation.
MMMM     Displays the month as a full month name.
h        Displays the hour as a number without leading zeros.
hh       Displays the hour as a number with leading zeros.
m        Displays the minute as a number without leading zeros.
mm       Displays the minute as a number with leading zeros.

Other useful DAX time functions are listed below:


Function   Description
Weekday    Returns the number identifying the day of the week.
Weeknum    Returns the week number within a year.
Date       Returns the date given integer representations for the year, month, and day.

More date formats and DAX functions can be found at the links below:
http://office.microsoft.com/en-us/excel-help/custom-date-and-time-formats-for-the-format-function-dax-HA102837261.aspx
http://technet.microsoft.com/en-us/library/ee634550.aspx
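As a sketch of how these functions can be combined (the Time column comes from the Unit Performance view imported earlier; the suggested column names are only examples), calculated columns along these lines give a sortable weekday number, a readable weekday name, and the hour of the day:

= WEEKDAY([Time])          (rename the new column to, for example, Weekday Number)
= FORMAT([Time], "dddd")   (rename to Weekday Name)
= HOUR([Time])             (rename to Hour of Day)

The numeric weekday column can then be used with the Sort by Column option in the PowerPivot window so that weekday names appear in calendar order rather than alphabetical order.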


10.4.1 Exercise Create DAX Calculations, Relationship and Sort


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.
Objective: In the Fleet Generation report we would like to add two Pivot Charts to
compare the generation based on the day of the week and the hour of the day.
What time of day has the largest generation? What day of the week has the largest
generation?

Approach:

Create two DAX calculations to determine the weekday and hour from the time field.
Create two PivotCharts for the total gross generation segmented by the day of week
and by the hour of the day for all the operators.

Estimated time 20 minutes.


10.5 Limit Data Viewed by Customers


Columns from the tables in the PowerPivot database can be hidden from the reports you develop.
For instance, you may want to hide columns that include things you will not need in your
pivot tables or charts. Extra columns add clutter to the pivot table field list when you or
other users are building reports in Excel.
Earlier, we brought the Unit ID into the PowerPivot report. This field would make no sense to the end user
as the value is a random string containing numbers and letters. To hide these fields from the end user, go
to the PowerPivot Manage view, select the Unit Performance sheet, right-click on the Unit ID field, and
select Hide from Client Tools. This step will need to be performed for the Unit Specifications table as
well.


10.6 Slicers
As previously mentioned, slicers make selecting report filters an intuitive experience for
users. Slicers are created by PowerPivot whenever an item is dragged into the vertical or
horizontal Slicers areas in the PowerPivot Field List dialog. One feature of slicers that
makes them particularly useful is that they are aware of the relationships behind them. As
users make their selections in one slicer group, buttons in the associated slicer groups will be
disabled if the selection would result in an empty dataset. This feature helps lead users to
successful analytic results and avoids wasted time making selections that result in an
empty pivot table.
Slicers are always initially connected to the pivot table or chart from which they were
created. However, we often need to have a combination of several pivot tables and charts
within the same workbook to make our report effective. In many cases we need to have one
slicer act as a filter for more than one of these charts or tables.
The chart needs to be filtered by the Technology slicer.
If we were to configure the slicers for the chart, like we did for the PivotTable, PowerPivot
would generate two additional slicers in the report. These slicers would only control the chart
and the previously configured slicers would only control the table.
The existing slicers need to be configured to filter the table, the chart, or both.
Select the slicer to edit, then right-click on its box. Select the Report Connections option to
edit the items controlled by the selected slicer.

The dialog will display the options associated with the slicer and contained within the
working spreadsheet.


For our report, the Technology slicer will only connect to the table. However, for the
Operator slicer, we want to filter both the table and chart.

For pivot charts, PowerPivot actually creates a pivot table on another worksheet to support
the chart. In this example, the worksheet named Data for Weekday Gen Chart has been
created in the spreadsheet. This is where we need to connect the slicer to filter the chart. We
could select any of the other report tables, even though they are on other worksheets in the
spreadsheet.
If you are not sure what the name of your pivot table is, you can access it from the
PivotTable Tools menu, which appears anytime you select a cell in the pivot table. The
table name is shown at the far left of the ribbon. A good rule of practice is to give all of your
pivot tables and charts meaningful names so you can locate them easily when configuring
slicer connections.


10.6.1 Exercise Add Slicers; make connections to chart/table


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Objective: Reusability and ease of interpretation are essential.


Which technology has the greatest impact on the power generation?

Approach:
1. Review the set-up of the slicers.
2. Connect the Technology slicer to the Chart.
Estimated time: 10 minutes


11 Exploring the Data with PowerView


PowerView provides an intuitive and interactive way to perform data analysis on data stored
in a PowerPivot spreadsheet.
PowerView is based on Microsoft's Silverlight technology, which gives us auto-scaling
functionality to fit nicely in any size browser window.
With PowerView you can interact with data:

In the same Excel workbook as the Power View sheet.


In data models in Excel workbooks published in a PowerPivot Gallery.
In tabular models deployed to SQL Server 2012 Analysis Services (SSAS) instances.

In Excel, PowerView sheets are part of an Excel XLSX workbook.


Select the item of interest from the PowerPivot database.
Insert PowerView

All visualizations start with a table which can be started by simply dragging and dropping
fields to the table area.


In PowerView, you can quickly create a variety of data visualizations, from tables and matrices to bar,
column, and bubble charts, and sets of multiple charts. For every visualization you want to create, you
start on a Power View sheet by creating a table, which you then easily convert to other visualizations to
find the one that best illustrates your data.


To create a table, click a table or field in the field list or drag a field from the field list
to the view. Power View draws the table in the view, displaying your actual data and
automatically adding column headings.
To convert a table to other visualizations, click a visualization type on the Design tab.
Depending on the data in your table, Power View enables and disables different
visualization types to give you the best visualization for that data.

TIP To create another visualization, start another table by clicking the blank view before selecting fields
from the fields section of the field list.

With a data item specified, a map can be added in PowerView. Maps use Bing map tiles, so you can zoom and
pan as you would with any other Bing map. Adding locations and fields places dots on the map. The
bigger the value, the bigger the dot. When you add a multi-value series, you get pie charts on the map,
with the size of the pie chart showing the size of the total.


In a way, configuring reports in Power View is similar to PowerPivot. The tables and
columns of the PowerPivot database you are reporting against are shown in the list on the
upper right. Dragging items down into the fields area below allows you to configure the report.

To add a second view to this report, select Power View under the
Power View or Insert ribbons. You should be brought to a blank
Silverlight canvas, ready to configure.


Start by dragging the utilization, efficiency, and station columns into the Fields list in the
lower right-hand corner of the window. Convert the utilization and efficiency aggregations to be
averages instead of sums, as is the default (use the down arrow icon located next to each
measure to do this). As you add columns, you should see a table being built on the canvas.
Don't worry about the order of things; it won't matter.

All chart configurations start as tables. You can decide to leave them that way, or you can
change them to any of several different visualization objects. To make a change, choose one
from the Chart Tools Design tab. Charts can be changed at any time into anything. Power View
will take a look at the data types provided in your configuration and will only enable design
choices that fit the data you have chosen.

Picking a vertical cluster (clustered column chart) for the table shown above will result in the
bar graph shown below.


12 Final Exercise: Create PowerView and PowerPivot


reports
Objective:
Determine the carbon footprint of each unit and display on a US map. Create a PivotChart to
analyze downtime events.
Approach:

The table below contains sample geospatial information for all units in Fleet
Generation. The full table of data is located in the OtherTables Excel Workbook
inside the Student Files folder. This data will need to be imported into the data cube.
Hint: This step can be accomplished through AF Tables or Linked Table in Excel.
City                Unit    Latitude     Longitude
Albertsville        GAO02   45.267102    -93.742036
Albertsville        GAO01   45.267820    -93.741779
Beryl Ridge         BCU02   41.955105    -91.542975
Beryl Ridge         BCU01   41.957673    -91.542685
Brick Canyon        PLT02   39.317543    -80.163257
Brick Canyon        PLT01   37.722195    -89.225657
Carbondale          TCB06   37.723976    -89.224662
Carbondale          TCB05   34.066253    -86.303215
Wolverine Station   ALX01   45.267102    -93.742036

From within SQL Commander, create a View for Inactivity and Temperature
Anomaly Event Frame data. (Hint: Create an Event Frame transpose function against
the Inactivity and Gas Turbine Temperature Anomaly templates. You will also need
to find the UnitID field for the event frames for the PowerPivot table relationships.
Look for PrimaryReferencedElementID)

From PowerPivot create relationships


o between the unit specifications table and the longitude/latitude table (city)
o between Inactivity / Gas Turbine Temperature Anomaly (Referenced
Element ID) and Unit Specifications (Unit ID) Note: The names are
dependent upon the field names in the views.

Insert a map within PowerView to display the region of each of the units and the associated total
hourly carbon emissions.

Create a new spreadsheet and insert two PowerPivot charts for the number of event frames and
average duration of event frames.

Customize the display to make it more user friendly for later use and report generation.

13 Scripting in PI ProcessBook (Optional)


This is an advanced clients course, so an understanding of PI ProcessBook
displays is assumed, even though a display is being provided for this section.

13.1 Scripting in PI ProcessBook


In the 1990s, most applications for the personal computer were still in their formative stages
and standards were few and far between. Microsoft changed much of that by driving
standards through their Windows operating system. One of the standards that begged to be
created was a scripting standard.
Fortunately for users, Microsoft did not develop a new and different standard for scripting within
applications. Microsoft drew upon a proven standard and implemented it in their Office
Suite of tools. That's how Visual Basic for Applications (VBA) came to be. Based on Visual
Basic, both experienced developers and novices could develop effectively.
This also makes it possible to call ProcessBook from external applications, or to call external
applications and controls, such as ActiveX controls, from within ProcessBook.

You want to be able to use a calendar dropper to select start and end times for a trend. Since
PI ProcessBook does not have this functionality, you will have to use an ActiveX control
object to do this.


13.1.1 Directed Activity Add ActiveX controls to a Display


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objective:
Allow the time range of a trend to be changed by date and time picker control objects.
Approach:
1. Create a new PI ProcessBook display file
2. Verify the display is in build mode
3. Add a trend for tag CDT158
4. Create two ActiveX controls of type Microsoft Date and Time Picker Control 6.0
5. Create a Button with action set to Macro.
6. Create a Macro named SetTimeRange
7. Within the Microsoft Visual Basic editor, insert the following code:
Sub SetTimeRange()
    Dim StartTime As Variant
    Dim EndTime As Variant
    StartTime = ThisDisplay.DTPicker1.Value
    EndTime = ThisDisplay.DTPicker2.Value
    ThisDisplay.Trend1.SetTimeRange StartTime, EndTime
End Sub

The time ranges are declared as type Variant and extracted from the two date time pickers
(DTPicker1, DTPicker2).

8. Close Microsoft Visual Basic editor and click OK on the button dialogue
9. Enter run mode and select a time range. Click the button to execute the command.


13.2 Alarm Sample Overview


A common event used for coding responses in ProcessBook is the Symbol_DataUpdate event. In this
exercise, an alarm will sound (provided the environment allows it) when certain conditions are met.
Whenever an update to the symbol occurs, the Symbol_DataUpdate event is triggered and the value
associated with the update is checked against the criterion.
This exercise should produce an illustration similar to the illustration below:

Here is the primary property, method, and event you need to know:

Value_DataUpdate. This event triggers when a new value is received from the PI Data Archive for the
PI tag used in the "Value" object.
The Value object raises a DataUpdate event. This will serve as the trigger for the script.

13.3 Setting Up and Acknowledging Alarms


You can also use the "Properties" dialog, which allows you to rename PI ProcessBook
symbols. To name any PI ProcessBook symbol, right-click the symbol in build mode to bring
the Properties screen into view, and give the symbol a more appropriate name.
For example, you might want to rename the default Value5 to a more representative
name, such as "FeedFlow".

To enable the symbols for scripting, select the Enable Scripting item in the menu. (In the
above item, the EnableScript option is grayed out since it is already set to TRUE as seen in
the Properties screen.)


13.4 What Triggers Scripts?


So, I have enabled scripting, how can I trigger an alarm? In this example, we plan to use a
specific value for a PI tag to trigger the alarm. As a PI tag is updated, an event occurs called
the DataUpdate event. Each symbol in a PI ProcessBook display has a name associated with
it (as seen above). The symbols have specific events associated with them that can be used to
trigger desired responses. For example, in our display, a PI tag (CDM158) is being
monitored and a routine is triggered as it is updated.
You have a PI ProcessBook display that shows the current value of a tag (CDM158). You want to be able
to have the computer sound an audible alarm or provide a visual warning when the value of the tag goes
into a certain state (Cascade). Since PI ProcessBook does not have this capability, you will use Visual
Basic for Applications (VBA) and script the solution.
You will use the DataUpdate event to trigger the playing of the sound file. Compare the value sent to PI
ProcessBook to the target (Cascade) state. When they match, play the sound.
You will use the file buzzer.wav (already on your system in the C:\StudentFiles\03-ProcessBook\Alarm folder).
Below is a code snippet that will trigger the alarm. In our display, there are two items being
monitored: a trace in a trend and a value symbol. Some familiarity with VBA coding is assumed,
so only specific portions of the code will be discussed.
Private Sub Trend1_DataUpdate(ByVal ntrace As Integer)
    Dim time As Variant
    Dim status As Variant
    Dim tValue As Variant
    Trend1.CurrentTrace = 1
    tValue = Trend1.GetValue(time, status)
    If tValue = "Cascade" Or tValue = "Prog-Auto" Then
        AlarmON
    End If
End Sub

Private Sub Value1_DataUpdate()
    Dim vstatus As Variant
    Dim vValue As Variant
    vValue = Value1.GetValue(NOW(), vstatus)
    If vValue = "Auto" Then
        AlarmON
    End If
End Sub


The two subroutines above are both DataUpdate events, but due to the nature of the symbols,
the parameters required vary. Since trends can contain one or more traces, the trace being
monitored must be indicated.
As the specified trace or value is updated, the respective routine is triggered and begins to
evaluate the conditions. In either subroutine, the values are checked to determine the value of
the PI tag. If the Trend tag value evaluates to either Cascade or Prog-Auto, an alarm will fire
off. The Value symbol will trigger an alarm if the PI tag evaluates to Auto.
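The AlarmON routine itself is not shown in the snippet. A minimal sketch of what it could look like, assuming the buzzer.wav path used in this exercise and the standard Windows sndPlaySound API from winmm.dll, is shown below; the routine in the class display may be implemented differently:

' Assumed declaration; place it in the declarations section of the module.
' (On a 64-bit VBA host the declaration would also need the PtrSafe keyword.)
Private Declare Function sndPlaySound Lib "winmm.dll" Alias "sndPlaySoundA" _
    (ByVal lpszSoundName As String, ByVal uFlags As Long) As Long

Sub AlarmON()
    ' Play the buzzer without blocking the display (1 = SND_ASYNC).
    sndPlaySound "C:\StudentFiles\03-ProcessBook\Alarm\buzzer.wav", 1
End Sub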


13.4.1 Directed Activity Review VBA of Display


In this part of the class, you will perform a learning activity to explore the different
concepts presented in this chapter or section. You may be invited to watch what the
instructor is doing or perform the same steps at the same time. You may play a game
or hold a quiz. Your instructor will have directions.

Objectives:

Review code structure.


Step into code to review.
Review Results.
Acknowledge Alarms.

Approach:
1. Open Alarm.pdi from C:\StudentFiles\03-ProcessBook\Alarm\Alarm.pdi
2. Verify the display is in build mode.
3. Open the Visual Basic Editor
4. Create a break point in the code in the trend data update routine. To create a
breakpoint, select a line in the code, other than a Dim statement, by clicking in the
column to the left of the line.

5. Trace the code within the routine: use the debugger to step through the
program.
Note: While the code window is active, push F8 to step through the code.

6. Review Acknowledge routine section.


13.5 The Trend_TimeRangeChange Event


Anything a user can do, a script can do (if you've exposed the right set of properties and methods of
your application). When a user finally zooms in on the part of a trend that is really important, they
usually have to stop and change all the other trends to show the same interval.
This exercise demonstrates how to do it programmatically. The result should produce something similar
to the following illustration:

(Illustration: two trends, Trend1 and Trend2, displayed with synchronized time ranges.)

Here are the properties, methods, and events you need to know:

Trend_TimeRangeChange. The TimeRangeChange event of the Trend object triggers when the user
changes the timerange of the trend. It returns two string variables (starttime and endtime) that
represent the new start and end time of the trend.

Object.SetTimeRange. This method sets the time range of the object. The object is typically a trend, a
bargraph, a value, or a display.

1. The Trend object raises a TimeRangeChange event.


1.1. This will serve as the trigger for the script.
2. The TimeRangeChange event returns a starttime and endtime variable.
3. The Trend object supports a SetTimeRange method that sets the time to the starttime and endtime
passed to it. If you are used to IntelliSense, be aware that the default syntax is not always
acceptable. Try removing the parentheses ( ).


13.5.1 Sync Trends Time Ranges

This solo or group exercise is designed to maximize learning in a specific topic


area. Your instructor will have instructions, and will coach you if you need
assistance during the exercise. Please try to solve the problem presented without
using the Solution Guide.

Objective:
Multiple Trends are contained in a display used by the control room. When reviewing the
trends, the operators must manually update their time ranges. Stop the complaints about the
display by handling the TimeRangeChange event to update the other trends whenever a trend's
time range is changed.
Approach:

Start with the Alarm Display.


Add at least one new trend to the display.
Create a routine to synchronize the time ranges of the trends.
Below is an example of code to update an additional trend in the Alarm Display.
Please be aware the number of statements you generate will vary based on the
number of additional symbols added to the display that are to be updated with the
new time range.

Note: It may be necessary to Enable Scripting on the symbol before it is seen in the code.

Private Sub Trend1_TimeRangeChange(ByVal StartTime As String, ByVal EndTime As String)
    ThisDisplay.Trend2.SetTimeRange StartTime, EndTime
End Sub

Note: The name of the Symbol could vary, so double check the names. There is also a
programmatic way to find all trend symbols in a display, as shown in the next section.


13.6 The Trend_DropCursor Event


A simple request is to change the timestamps of all the values on a display to match the timestamp of the
current location of the trend cursor. Here is a simple program to do this.

(Illustration: a display containing a Trend symbol and dynamic Value symbols Value1 through Value4.)

Here are the properties, methods, and events you need to know:

Trend_DropCursor. This event triggers when a user drops a cursor on a trend. It returns the string
variable newtime that is the timestamp of the location of the cursor.

ThisDisplay.Symbols. The collection of display objects on a display.

ThisDisplay.Symbols.Count. The number of symbols in the symbols collection.

Symbols.item(i). The "item" method identifies by ordinal or by name a member of a collection.

Symbols.item(i).Type. ProcessBook has many types of symbols. They are defined in the library
PBObjLib. In this example, we test the symbol type against the Class ID pbSymbolValue, a constant
that represents type 7, the Value symbol. (See the online help for VBA in PI ProcessBook at
\\program files\pipc\help\EN\pipbvb.chm for more information. This is also available from inside the
VBA editor in ProcessBook. Open a code page, highlight a ProcessBook VBA property or method,
and press F1 to launch the help file.)


13.6.1 Making use of PI ProcessBook Constants


This example is the first to use a constant that is defined in one of the PI ProcessBook type libraries. PI
ProcessBook has many constants. (See the online help for VBA in PI ProcessBook at \\program
files\pipc\help\EN\pipbvb.chm for more information. This is also available from inside the VBA editor in
ProcessBook. Open a code page, highlight a ProcessBook VBA property or method, and press F1 to
launch the help file.) They are defined in the libraries PBObjLib and PBSymLib. In this example, we need
to determine if the symbol is or is not a trend. To do this we compare the type property of the symbol to
the Class ID pbSymbolValue, a constant that represents type 7, the value symbol. Here is the code
example:
If Symbols.Item(1).Type <> pbSymbolValue Then
etc....
The class libraries PBObjLib and PBSymLib are accessible within PI ProcessBook because they are
referenced in the Tools>References menu as the "PI ProcessBook Type Library" and the "PI
ProcessBook Symbol Library". These libraries can be explored using the Visual Basic Object Browser by
choosing View>Object browser. Select the libraries PBObjLib and PBSymLib.
If you are writing a script to use PI ProcessBook objects externally through Visual Basic or VBA in
Excel, Word, or Access, you can make use of these two libraries by checking the appropriate checkbox in
the Project>References (in VB) or the Tools>References (in VBA) dialog box. If the libraries do not
appear in the list of libraries, you can browse for them in \\pipc\procbook\pbobjlib.tlb and
\\pipc\procbook\pbsymlib.tlb.
Many of the class IDs of objects provided by PI ProcessBook are defined in the PBObjLib library. (See
the online help for VBA in PI ProcessBook at \\program files\pipc\help\pipbvb.chm for more information.
This is also available from inside the VBA editor in ProcessBook. Open a code page, highlight a
ProcessBook VBA property or method, and press F1 to launch the help file.) Specific PI ProcessBook
symbols such as the arc, trend, and value are defined in PBSymLib.


13.6.2 Exercise Dynamic Values with Cursor Syncing


This solo or group activity is designed to maximize learning in a specific topic area.
Your instructor will have instructions, and will coach you if you need assistance
during the activity. Please try to solve the problem presented without using the
Solution Guide.

Exercise Objectives
Open DropCurs_Blank.PDI and add the code that allows the dynamic values to keep in sync with the
trend.

Private Sub Trend_DropCursor(bCancel As Boolean, ByVal nCursor As Integer, _
        ByVal NewTime As String)
    Dim i As Long
    For i = 1 To ThisDisplay.Symbols.Count
        If ThisDisplay.Symbols.Item(i).Type = pbSymbolValue Then
            ThisDisplay.Symbols.Item(i).SetTimeRange "", NewTime
        End If
    Next i
End Sub

Note: This answer can be found in the file dropcurs1.pdi in the course files. The file dropcurs2.pdi
illustrates an alternate way of selecting the dynamic values, and dropcurs3.pdi illustrates how to use
multiple trends.


13.7 External ProcessBook Scripting


The scripts in this section manipulate objects externally. For example, there are scripts
written in Excel that manipulate ProcessBook.
13.7.1 The Application Object Allows Relative References
Up to this point all of our scripts referred to the current display as "ThisDisplay," which is
interpreted at run-time to be the display in which the code was written. However, sometimes
you need a script to work with whichever display is currently active. This is especially true if
the script is manipulating ProcessBook externally from another application such as Excel.
In order to get access to the current display, you can use the ProcessBook Application object
to move up the hierarchy to the top level object in PI ProcessBook. From there you can then
access the ActiveDisplay property to get the current open display.
Here are the properties, methods, and events you need to know:
Application. The current active ProcessBook application. It can also be written as
PBObjLib.Application if you want to explicitly identify the correct Application object.
(This can be useful if you have a project that references many different applications and thus
contains many references to objects called Application.)
Application.ActiveDisplay. The current active ProcessBook display.
ActiveDisplay.Symbols. The collection of display objects on a display.
Here is an example of a script written so that it works with any display that is open.
Sub ListObjects()
    Dim i As Integer
    For i = 1 To Application.ActiveDisplay.Symbols.Count
        Debug.Print Application.ActiveDisplay.Symbols.Item(i).Name
    Next i
    MsgBox (Application.ActiveDisplay.Symbols.Count & " Symbols")
End Sub
Note: This answer can be found in the file myutils.pdi in the course files.

13.7.2 How to Get Access to ProcessBook from Excel


The steps below create variables in VBA in Excel that are used to set references to objects in
ProcessBook.
Open Excel and press Alt + F11 to open the Visual Basic Editor
Select Tools => References to add references to the editor
Make sure PI-ProcessBook Type Library and PI-ProcessBook Symbol Library are checked


Edit the code in CommandButton1_Click. Do this by choosing View>Code.


Enter the code to assign variables as stand-ins for certain PI ProcessBook objects.

Dim appX As PBObjLib.Application
Dim pbkY As PBObjLib.ProcBook
Dim tndZ As PBSymLib.Trend
Dim tag1 As String

appX, pbkY, and tndZ are variables that will be used as stand-ins for PI
ProcessBook objects. As such, they are defined as the proper object types from the PBObjLib
and PBSymLib type libraries.
Use an if/then statement to determine whether or not ProcessBook is running. In either event,
set a reference to ProcessBook Application object in the variable appX.
On Error Resume Next
Set appX = GetObject(, "piprocessbook.application.2")
On Error GoTo errHandler
If appX Is Nothing Then
    Set appX = CreateObject("piprocessbook.application.2")
End If

The Set keyword assigns an object reference to a variable. In this case, the variable appX receives a reference to the PI ProcessBook Application object, which we will later use to drill down through the entire ProcessBook object model. Notice that CreateObject is invoked only if GetObject fails. GetObject attaches to an instance of ProcessBook that is already running; if it fails, we can be certain that ProcessBook is not running, so we launch it with CreateObject.


Note: You may wonder what the phrase PIProcessBook.application.2 means. This is the programmatic identifier (ProgID) that OSIsoft registered for the PI ProcessBook Application class, and it is stored in the Windows registry. Any program that needs to manipulate PI ProcessBook objects must pass this phrase to the GetObject or CreateObject command so that Windows can locate the proper DLL associated with PI ProcessBook objects. This is a standard ActiveX Automation technique. For example, if you want to manipulate Excel objects you also use GetObject with a specific ProgID:
Set xlobj = GetObject(, "Excel.Application")
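The same attach-or-launch pattern shown above for ProcessBook can be applied to Excel. The sketch below is illustrative only; the variable name xlObj is arbitrary and this code is not part of the course files:

Dim xlObj As Object
On Error Resume Next
Set xlObj = GetObject(, "Excel.Application")    ' Attach to a running Excel instance, if there is one.
On Error GoTo 0
If xlObj Is Nothing Then
    Set xlObj = CreateObject("Excel.Application")    ' Otherwise start a new instance.
End If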
Create a Display to Trend Selected Tags

The previous two examples used existing ProcessBook displays. This example creates a new display. When the user clicks the Create a Display button, PI ProcessBook launches (if it is not already running) and creates a new display named MyNewDisplay. The Visual Basic script then adds a new trend using the tag name in the cell selected by the user.
New in this example:
Displays.Add. The method that creates a new display. If you intend to keep the display, you should also invoke the Display.Save or .SaveAs method.
Note: You can use the file PBExtScr_blank.xlsx in the course files as a template. This file contains all the objects (such as trends and command buttons) but no VBA code. Use it to try to complete the exercise without looking at the solution. This example is Button1_Click() in Module1.


Solution
Private Sub Button1_Click()
    Dim appX As PBObjLib.Application
    Dim disY As PBObjLib.Display
    Dim tndZ As PBSymLib.Trend
    Dim tag1 As String
    ' If ProcessBook is not already open, launch it.
    On Error Resume Next
    Set appX = GetObject(, "piprocessbook.application.2")
    On Error GoTo errHandler
    If appX Is Nothing Then
        Set appX = CreateObject("piprocessbook.application.2")
    End If
    ' Add a new display to the Displays collection.
    Set disY = appX.Displays.Add("MyNewDisplay")
    ' Copy the tag name from the active cell in the spreadsheet.
    tag1 = ActiveCell.Value ' Take tag name from the active cell.
    ' You have the option to concatenate the node name to the tag name
    ' using the \\servername\tagname format. Use this if you want to
    ' trend a tag that is not on the default PI Data Archive.
    'tag1 = "\\localhost\" & tag1
    ' Add a new symbol of the appropriate type (trend) to the Symbols collection.
    ' Note: we can use pbSymbolTrend because we made reference to the
    ' PBSymLib and PBObjLib libraries in Tools > References.
    Set tndZ = disY.Symbols.Add(pbSymbolTrend) ' Adds a trend symbol to the display.
    ' Change the shape and location of the new trend to fit the display.
    tndZ.Top = 15000
    tndZ.Left = -15000
    tndZ.Height = 200
    tndZ.Width = 1000
    tndZ.Maximize True
    ' Add a trace to the trend.
    tndZ.AddTrace tag1 ' Add the tag to the trend created above.
    disY.Modified = False
    ' Bring ProcessBook to the foreground.
    AppActivate ("PI ProcessBook")
    Exit Sub
errHandler:
    MsgBox ("Error occurred: " & Err.Description)
End Sub
Note: This answer can be found in the file PBExtScr.xlsx in the course files under the Button1_Click
event.


Appendix A Substitution Parameters


Defining the Substitution Parameters
The substitution parameters are listed in the following table. The most commonly used are the name substitution parameters, such as %Element% and %Attribute%.

Parameter - Will be replaced by

%..\Element% - The name of the owning element of the element in which the attribute resides. To retrieve further ancestors, use the '..\' notation, such as %..\..\Element%.
%..|Attribute% - The name of the owning attribute in which the attribute resides. To retrieve further ancestors, use the '..|' notation, such as %..|..|Attribute%.
%@Attribute% - The value of the attribute referenced. To retrieve further ancestors, use the '..|' notation, such as %@..|..|Attribute%.
%\Element% - The name of the root AF element in which the attribute resides.
%<Environment Variable>% - The matching system environment variable's value. For example, %COMPUTERNAME% is replaced with the name of the computer on which the data reference is executing.
%Analysis% - The name of the analysis, if it can be obtained from the context.
%Attribute% - The name of the attribute that holds this data reference.
%AttributeId% - The ID of the attribute that holds this data reference.
%Database% - The name of the AF database in which the attribute resides.
%Description% - The description of the attribute that holds this data reference.
%Element% - The name of the AF element in which the attribute resides.
%ElementDescription% - The description of the element in which the attribute resides.
%ElementId% - The ID of the element that holds this data reference.
%EndTime% - The local end time, if it can be obtained from the time context.
%Model% - The name of the model, if it can be obtained from the context.
%Server% - The name of the default PI Data Archive of the AF database in which the attribute resides.
%StartTime% - The local start time, if it can be obtained from the time context.
%System% - The name of the PI System in which the attribute resides.
%Time% - The local time, if it can be obtained from the time context.
%UtcEndTime% - The coordinated universal (UTC) end time, if it can be obtained from the time context.
%UtcStartTime% - The coordinated universal (UTC) start time, if it can be obtained from the time context.
%UtcTime% - The coordinated universal (UTC) time, if it can be obtained from the time context.
.\ - The current reference.
[.] - The default object of the parent collection. For example, .\Elements[.]|Temperature returns the Temperature attribute from the primary element of the current reference's Elements collection.
[@filter=text] - The search string in text (e.g. Tank*) matches the given filter. Supported filters are: @Name, @Index, @Template, @Category, @ReferenceType, @Description, @Type, @UOM.
[@Index=#] - Returns the result at location # from the collection result.
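As a simple illustration of how these parameters are used (the element, attribute, and server names below are hypothetical and not part of the course database), a PI Point data reference whose tag name is configured as

\\%Server%\%Element%.%Attribute%.PV

would, for an attribute named Temperature on an element named Pump01 in a database whose default Data Archive is PISRV01, resolve to the tag \\PISRV01\Pump01.Temperature.PV.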


Appendix B Additional PowerPivot Resources


There are many other resources available to learn more about PowerPivot. Here are just a few.
OSIsoft Resources

OSIsoft Users Conference www.osisoft.com

PI Developers Club pisquare.osisoft.com/community/developers-club

PI T&D Users Group Site extranet.osisoft.com

For SRP Customers learning.osisoft.com


Microsoft Resources

PowerPivot download http://www.microsoft.com/en-us/bi/powerpivot.aspx

Windows Azure Marketplace https://datamarket.azure.com/


Helpful Books

PowerPivot for the Data Analyst, Bill Jelen

Practical PowerPivot & DAX Formulas for Excel 2010, Art Tennick


Appendix C Performance Equation Operands and Functions


Taken from the PI Data Archive Application User Guide

Operands in Performance Equations


Operand Type           Syntax Requirements                   Examples

Numbers                (none)                                1342  98.6  .0015  1.2e2
Tagnames               In single quotes                      'sinusoid'  'ba:level.1'  'ba:phase.1'
PI Time Expressions    In single quotes                      '01-dec-03'  '16-jul-94'  '*'
Strings                In double quotes                      "string string string"  "sinusoid"
Functions              Must be a Performance Equation        TagVal('sinusoid')  TagAvg('sinusoid')
                       function                              Cos('sinusoid')
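To see how these operand types combine, consider the following expression (purely illustrative; it is not used elsewhere in this course). It averages the demo tag sinusoid over the last hour and returns one of two strings depending on the result:

If TagAvg('sinusoid', '*-1h', '*') > 50 Then "High" Else "Normal"

Here 'sinusoid' is a tagname, '*-1h' and '*' are PI time expressions, 50 is a number, "High" and "Normal" are strings, and TagAvg is a Performance Equation function.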

Functions Listed By Type


The following tables list all functions by type. This list can also be found in the
PIPC\HELP\PEReference.chm help file.

Math Functions
Name        Description

Abs         Absolute value
Asin        Arc sine
Acos        Arc cosine
Atn         Arc tangent
Atn2        Arc tangent (two arguments)
Cos         Cosine
Cosh        Hyperbolic cosine
Exp         Exponential
Float       Conversion of string to number
Frac        Fractional part of number
Int         Integer part of number
Log         Natural logarithm
Log10       Common logarithm
Poly        Evaluate polynomial
Round       Round to nearest unit
Sgn         Numerical sign
Sin         Sine
Sinh        Hyperbolic sine
Sqr         Square root
Tanh        Hyperbolic tangent
Tan         Tangent
Trunc       Truncate to next smaller unit


Aggregate Functions
Name        Description

Avg         Average
Max         Maximum
Median      Median selector
Min         Minimum
PStDev      Population standard deviation
SStDev      Sample standard deviation
Total       Sum

Miscellaneous Functions
Name        Description

BadVal      See if a value is bad (not a number or time)
Curve       Get value of a curve
DigState    Get digital state from a string
IsDST       Test whether a time is in local daylight savings time period
IsSet       Test if a PI value is annotated, substituted, or questionable
StateNo     The code number of a digital state
TagBad      See if a point has an abnormal state

PI Archive Retrieval
Name        Description

NextEvent   Time of a point's next Archive event
NextVal     Point's next value after a time
PrevEvent   Time of a point's previous Archive event
PrevVal     Point's previous value before a time
TagVal      Point's value at a time

PI Archive Search
Name        Description

FindEq      Timestamp when point = value
FindGE      Timestamp when point >= value
FindGT      Timestamp when point > value
FindLE      Timestamp when point <= value
FindLT      Timestamp when point < value
FindNE      Timestamp when point ~= value
TimeEq      Total period when point = value
TimeGE      Total period when point >= value
TimeGT      Total period when point > value
TimeLE      Total period when point <= value
TimeLT      Total period when point < value
TimeNE      Total period when point ~= value
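As an illustrative example (not used elsewhere in this course), the following expression returns the total time, in seconds, during which the demo tag sinusoid was greater than 70 over the last day, assuming the standard tagname/starttime/endtime/value argument order documented in the PE reference:

TimeGT('sinusoid', '*-1d', '*', 70)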

PI Archive Statistics
Name        Description

EventCount  Number of Archive events
PctGood     Percent of good time in a period
Range       Range of minimum to maximum value
StDev       Time-weighted standard deviation
TagAvg      Time-weighted average
TagMean     Event-weighted average
TagMax      Maximum value in a period
TagMin      Minimum value in a period
TagTot      Time integral over a period

Point Attributes
Name        Description

TagDesc     Get a point's descriptor
TagEU       Get a point's engineering unit string
TagExDesc   Get a point's extended descriptor
TagName     Get a point's name
TagNum      Get a point's ID
TagSource   Get a point's point source string
TagSpan     Get a point's span
TagType     Get a point's type character
TagTypVal   Get a point's typical value
TagZero     Get a point's zero value

Time Functions
Name        Description

Bod         Timestamp for beginning of the day for a given time
Bom         Timestamp for beginning of the month for a given time
Bonm        Timestamp for the first of the next month for a given time
Day         Day of the month from a time
DaySec      Seconds since midnight from a time
Hour        Hour from a time
Minute      Minute from a time
Month       Month from a time
Noon        Timestamp for local noon of the day of a time
ParseTime   Convert character string to time
Second      Second from a time
Weekday     Day of the week from a time
Year        Year from a time
Yearday     Day of the year from a time
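For example (illustrative only), Bod('*') returns a timestamp for midnight at the start of the current day, and DaySec('*') returns the number of seconds elapsed since that midnight, so an expression such as the following tests whether the current time falls within the first hour of the day:

DaySec('*') < 3600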

String Functions
Name        Description

Ascii       ASCII character code for a character
Char        String for ASCII character code(s)
Compare     Wild comparison of two strings
DigText     Text for a digital state
Format      Formatting of a number
InStr       Instance of a sub-string
LCase       Conversion of all characters to lower case
Len         Length of a string
Left        First characters in a string
LTrim       Removal of blanks on the left side of a string
Mid         Extraction of a sub-string from a string
Right       Last characters in a string
RTrim       Removal of blanks on the right side of a string
Trim        Removal of blanks on both sides of a string
UCase       Conversion of all characters to upper case

String Conversion
Name        Description

Concat      Concatenate two or more strings
String      String representing any PI value
Text        Concatenation of strings for a series of PI value arguments


Appendix D PI SQL Commander Table Relationships
