
April-June 2005

Automating the Cognos PowerPlay Cube Building Process


Brian Morris, Cognos Architect / Administrator, Kerr-McGee Corporation, Oklahoma City, Oklahoma

Editor’s note: This article is based on Brian Morris’ award-winning presentation at the Cognos Forum held in June 2004
in Florida. Morris was voted People’s Choice winner as the top presenter at the Forum.

The topic I chose to present at the Cognos Forum ’04 was Automating the PowerPlay Cube Building Process. This is a topic
that is very near and dear to my heart. In the past three years, my company has gone from building just three cubes to more
than 60 cubes a day. With the vast majority of these cubes being built between midnight and 8:00 a.m., having an effective
automation strategy was critical.
What started with 20 lines of code that simply built and published a single cube has grown into 4,000-plus lines of code that
meet a wide range of business requirements. The business sets these requirements and it is our responsibility as architects and
administrators to design and build systems that meet or exceed them.
My company required that cube builds be initiated in one of three ways. The first method is to initiate the cube build based
on the presence of a flag file, a specific file at a specific location. These flag files are most often created at the end of a data
warehouse load or other ETL process. The second method is to simply schedule cube builds at a given time and/or date using
batch files kicked off by NT Scheduler. The third and most common method at my company is to initiate the cube build after
a specific SQL condition is met. In this third method the condition is most frequently checked every 15 minutes during a given
window of time, e.g. from 1 a.m. to 5 a.m.
Another business requirement was to have email/pager notifications built into our cube build process. Notifications of
success, failure, or called-off cube builds are sent both to system administrators as well as to the data/process owners of each
cube. These notifications include a detailed cube build macro log and the PowerPlay Transformer log.
The most important business requirement in my view is to validate cube build success prior to publishing to production. No
cube is published into our production environment until a series of checks on the build are performed. These checks help to
ensure that no incomplete or failed cubes make it to the production environment. Nothing harms business intelligence
initiatives faster than delivering faulty or inaccurate information to the end users.
To implement this strategy, I chose to use the CognosScript macro language. This is very similar to Visual Basic with the
addition of specific Cognos properties and methods. This language comes bundled with PowerPlay Transformer and allows
for 100% customization to our needs.
Using CognosScript, I have developed a 12-step process for building cubes. During and after the Cognos Forum, I was
contacted by more than 200 attendees and fulfilled their requests for CDs that contain working examples of the 12 steps.
Hundreds more copies of this 12-step process have been downloaded at no charge from my Web site (bmorrishome.org).
What follows is a brief description of each of these steps and some of the basic principles I use in our production
environment. The DemoCubeBuildScript.mac macro and associated files can be downloaded using the link at the bottom of
this article. While the script is written in a way that will allow you to adapt it to your environment, I recommend that you
have a good working knowledge of CognosScript if you plan to use the demo macro. Customization will be required
primarily in the section below the embedded comment that reads “' Set the locations and preferences in this section. This
section is the only place you need to modify for building new Transformer Models (PYI files).” Questions about the macro
should be directed to brian@bmorrishome.org and not to Supportlink or Cognos Customer Support.

The 12 Steps

Step 1 Rename Old Macro Log File


As each step is performed, entries are made into a log file. This log file, along with the PowerPlay Transformer cube build
log file, is sent to administrators and data owners at the conclusion of each cube build. To maintain a history of previous
cube builds, all prior log files generated by this macro are renamed by simply appending the file date/time stamp to the end
of the file name. After the old log file is renamed, a new one is created to log the current cube build attempt.

Step 2 Flag File Processing Start


If the cube build is contingent on the presence of a flag file, then this step will run. In this step a simple “Dir” command
verifies the presence of the Flag File. If found, the Flag File name is appended with “_InProcess” to indicate that the cube
build process has started and to prevent additional cube build attempts from being triggered by this Flag File. After the Flag
File is renamed, the cube build process continues. In the event that the Flag File is not found, then the cube build process is
“Called-Off”.

Step 3 SQL Condition Processing


If this cube build is contingent on meeting the conditions of an SQL Query, then this step will run. In this step an SQL Query
is run using a System level ODBC database connection. If the query returns the necessary values as specified in the macro,
the cube build process will continue. If the query does not return the necessary values, the cube build process is “Called-Off”.
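In production this runs over a system-level ODBC connection; as a self-contained sketch of the same gating logic, here is a Python version using SQLite in place of ODBC (the `load_flags` table and its columns are hypothetical):

```python
import sqlite3

def sql_condition_met(conn, query: str) -> bool:
    """Run the readiness query; the build proceeds only when it
    returns at least one row with the expected values."""
    rows = conn.execute(query).fetchall()
    return len(rows) > 0
```

A scheduler would call this every 15 minutes inside the allowed window and either continue the build or call it off.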

Step 4 Rename Old Transformer Cube Build Log File


This step renames the old Transformer cube build logs by appending their file date/time stamp to the end of their file name.
This step allows for the keeping of several days of previous logs to use if needed.

Step 5 Back up the Existing Cubes


This step moves any old cubes that this Transformer model will be building to a backup file location. During this file move/copy
process, the file date/time stamp is appended to the end of each file to allow multiple copies of the same MDC file to be
maintained as a backup. This step is important because it will allow you to roll back to previous cubes if necessary.

Step 6 Delete Expired Logs and MDC Files


This step deletes log files and MDC files in the backup directory once they exceed the retention period, set in days within
the macro. This is basically a cleanup process that ensures your hard drive does not fill up with old MDC files while still
letting you keep several days of history online. I typically set this at 10 days.

Step 7 Backup the Transformer Model


This step backs up the Transformer model based on file name and date/time stamp. Model backups are never deleted, which
allows rolling back to any previous model. This is just another safety mechanism; it is not intended to substitute for a
good change control process.

Step 8 Build from Transformer Model


This step builds the cubes from the Transformer model. There are two different options for these cube builds, native
Transformer object or External Process. The default is to use the native Transformer process as it allows for additional
validation of the cube build success or failure.

Step 9 Cube Validation


This step is an additional set of checks to validate that the cube was built successfully without physically opening the MDC
file. To do this the Transformer log file is opened and each line is evaluated against a set of criteria that includes eight fatal
“TR” code errors. If the log file does not contain any of those errors, passes additional validation, and contains the phrase
“Timing, TOTAL TIME (CREATE CUBE),” then the build is deemed successful. Otherwise the build is labeled a failure and
proper notifications are generated. This step also includes a call to another process that parses the contents of the Transformer
log file into a SQL Server database for more detailed analysis and reporting.
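The log scan can be sketched as a simple line filter. In this Python stand-in, the TR codes shown are placeholders; the real list of eight fatal codes lives in the macro:

```python
# Placeholder subset of fatal codes; the macro defines the actual eight.
FATAL_TR_CODES = ("TR0112", "TR0708", "TR1901")
SUCCESS_MARKER = "Timing, TOTAL TIME (CREATE CUBE)"

def validate_build_log(log_text: str) -> bool:
    """A build passes only if no fatal TR code appears and the
    Transformer log contains the total-time marker."""
    saw_marker = False
    for line in log_text.splitlines():
        if any(code in line for code in FATAL_TR_CODES):
            return False
        if SUCCESS_MARKER in line:
            saw_marker = True
    return saw_marker
```

Requiring the positive marker, not just the absence of errors, is the key design point: a truncated or killed build fails validation even if it logged no error.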

Step 10 Cube Build Move to Production


This step physically moves the cubes from the cube build directory to the folder locations used by the PowerPlay Enterprise
Server (PPES) to respond to PowerPlay Web requests. In this process the cubes are disabled in PPES, moved, the move is
verified, and the cubes are re-enabled in PPES. This script allows up to eight cubes to be published to four PPES servers.

Step 11 SQL Update of Flag Record


This step performs a database update to indicate a successful cube build. This prevents the SQL condition from being met
again until a new record is added to the database following a successful data warehouse or ETL process.
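This is the write-side counterpart of the Step 3 readiness query. Sketched against the same hypothetical `load_flags` table (SQLite standing in for the production ODBC connection):

```python
import sqlite3

def mark_build_complete(conn, cube_name: str) -> None:
    """Flip the readiness flag so the same SQL condition cannot
    trigger another build until ETL inserts a fresh record."""
    conn.execute(
        "UPDATE load_flags SET processed = 1 WHERE cube = ? AND processed = 0",
        (cube_name,),
    )
    conn.commit()
```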

Step 12 Flag File Processing Finish


If a Flag File is used as a condition of the cube build, the Flag File “_InProcess” text is removed and the file is left with a
simple file date/time stamp appended to it.

Email Notification Within Each Step

Each step of my process includes tests and sets flags that are used to signal which type of email notification will be sent out.
One of the most important parts of our automation process is the sending of these notifications. Some of the reasons emails
are sent out are:

- Step 3 – A connection to the SQL query database is not successful.
- Step 3 – The SQL query returns no rows or columns.
- Step 8 – The Transformer object returns an error during its cube build.
- Step 9 – The Transformer log contains one of the eight fatal Transformer “TR” errors.
- Step 9 – The physical MDC file that should have been created by the cube build cannot be found.
- Step 9 – The Transformer log does not conclude with “Timing, TOTAL TIME (CREATE CUBE).”
- Step 10 – The cube cannot be disabled in PPES.
- Step 10 – The cube cannot be copied into the PPES cube folders.
- Step 10 – The cube file date/time stamp in the PPES cube folder is not newer than the file date/time stamp in the cube build folder.
- Step 10 – The cube cannot be enabled in PPES.
- Steps 1–12 – Any other unexpected failure in the cube build macro.

A structured build and publish process is a critical component of my company’s success in deploying PowerPlay to a large
number of clients. This process is more important than any specific line of code or method of execution. Plan your process
carefully and make it as comprehensive and as flexible as possible.
While there are several thousand lines of code and comments in my process, fewer than 30 of them need to be changed when
moving a new Transformer model into production. These changes are all made to variables at the top of the macro, which
enables rapid deployment and a consistent environment.

Conclusion
Attending the Cognos Forums benefits me and my company in two important ways. The first is that it provides me with
specific methods for accomplishing specific tasks. Whether in a session or in talking with another attendee, I continue to learn
new techniques that I bring home and implement. Secondly, and most importantly, I am exposed to new ideas and new ways
of meeting customer needs. Getting a fresh perspective and harvesting the best practices of others at the conference is
invaluable. I always leave the conference with a list of items I want to implement in my environment.

Demo script download: /supported/supportlink/15n2/techreview01_script_demo.zip
Contact: brian@bmorrishome.org

Copyright © 2005 Cognos Incorporated
