EMC Corporation
Corporate Headquarters:
Hopkinton, MA 01748-9103
1-508-435-1000
www.EMC.com
Copyright © 2009 EMC Corporation. All rights reserved.
Published May 2009
EMC believes the information in this publication is accurate as of its publication date. The information is subject to change
without notice.
THE INFORMATION IN THIS PUBLICATION IS PROVIDED AS IS. EMC CORPORATION MAKES NO REPRESENTATIONS
OR WARRANTIES OF ANY KIND WITH RESPECT TO THE INFORMATION IN THIS PUBLICATION, AND SPECIFICALLY
DISCLAIMS IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.
Use, copying, and distribution of any EMC software described in this publication requires an applicable software license.
For the most up-to-date listing of EMC product names, see EMC Corporation Trademarks on EMC.com.
All other trademarks used herein are the property of their respective owners.
This guide is intended to help EMC partners and practitioners use the Documentum Process Suite
products to build repeatable business solutions. It focuses on the best practices in the use of the
process suite products by covering topics such as planning, implementation, performance, and
deployment.
This guide addresses process-based applications and makes recommendations on how to integrate
these products to produce an end-to-end solution. These process-based applications can be broken
down into a series of steps, each performed by an individual or a software system. Examples
of process-based applications are ordering a book over the Internet, applying for a checking account,
placing a trip reservation, making an insurance claim, and applying for a loan.
Note: This guide should be used in addition to, and not a replacement for, the user guides that come
with the individual products. This guide also assumes that you are familiar with the Documentum
Process Suite products.
Note: We refer to the recommendations in this guide as best practices. However, most of the
recommendations are really good practices that work under most conditions. Each client is different
in both goals and environment, and not all recommendations work in all situations. As you design
a solution, consider the trade-offs between usability, performance, and time to market, and make
an assessment based on that information. For example, this guide provides several suggestions on
improving performance; these suggestions can be relaxed for small implementations with just a
handful of users. You may determine that it is best to deviate from some of these best practices based
on your trade-off analysis.
Intended Audience
The audience for this guide is the project team that is creating business solutions. In particular,
the primary audience includes Application Development Managers, Partner Project Architects,
and Project Managers.
Revision History
The following changes have been made to this document.
Business processes occur in many contexts: government, law, health-care, insurance, retail,
telecommunications, and other domains. A process orchestrates a combination of human and
automated activities that produce and deliver information. This information can be in the form of
structured data or unstructured content, and the process must be able to manage both types easily
and seamlessly. During the course of the process, users add documents and enter business data. The
process uses this information to route the process to the correct people. The people working in the
process use this process information to make the correct decisions.
One example of a process-based solution is Case Management. The case launches one or more
processes to perform its work. The best practices described in this document apply to both case
management solutions and to process-based applications in general. See the Case Management
Solution Framework Grants Management Sample Application and its documentation (Case
Management Solution Framework Grants Management Sample Application Guide) for an in-depth
discussion of case management concepts and practices.
Technology Considerations
Process-based applications blend the human element with the software system element. The human
element is the part that calls for making judgment calls and decisions. The software systems that
manage the information and the process support these decisions.
The solution is based on three technology foundations:
• Business Process Management - to define and orchestrate the series of activities to be performed
• Enterprise Content Management - to manage documents, images, and associated metadata
• Collaboration - to provide the ability for involved parties to interact, share information, and
make joint decisions
In addition, the following EMC products provide vital services to a process-based application
solution:
• Captiva: The product family for transforming business-critical paper, fax, and electronic data
sources into business-ready content suitable for processing by enterprise applications.
• Process Builder: Allows for the creation of process templates. A process template captures the
definition of a business process, enabling users to perform the process repeatedly.
• Forms Builder: An interactive tool used to create and modify form-based templates and store
them in a repository. Forms Builder enables users to design user interface templates for searching,
viewing documents and folders, process initiation, task lists, tasks, and basic forms within the
Documentum Process Suite.
• Process Engine/Process Integrator: The server components that orchestrate processes and
integrate with external systems and data.
• TaskSpace: An easy-to-configure user interface for task processing and document
retrieval in the business process. TaskSpace offers integrated document viewing with annotations
and enables the management of work queues. TaskSpace is the user interface for the application.
• Business Activity Monitor (BAM): Provides a performance dashboard for monitoring and
analyzing process-based applications. In real time, BAM tracks defined key performance
indicators (KPIs), reporting on process performance and providing alerts for conditions that fall
outside of designated thresholds. With BAM, the manager can assess the overall performance
level of the application and focus on problems in a specific case.
• Records Management: Provides the means to ensure compliance with legal and regulatory
requirements. The agency can set rules for retention policies to preserve case files for a specified
time. The process automatically triggers this capability.
• Documentum Collaborative Services: Enables teams to work securely and collaboratively within
web applications.
Case Management projects, like other Process Suite applications, are created in a series of phases. The
following diagram illustrates a typical phase structure:
Thorough planning and testing are important for process-based applications and can make the
difference between a project that succeeds and one that fails. The extensive use of templates for
creating processes, forms, reports, and user interfaces makes rapid iteration possible. Anticipating
risks, aligning on requirements, and designing the solution carefully are critical to success.
Another best practice recommendation is to use an agile methodology. Break up the full application
into smaller modules and establish frequent cycles of design, implementation, and testing, as
illustrated in the above diagram. This approach is considered a more successful alternative to the
classic waterfall approach.
The goal of the Process Suite is to enable solution construction with little or no custom coding. Use
templates to create processes, forms, and user interface screens. When planning, note that Process
Suite applications can require more time dedicated to requirements gathering and solution prototyping.
This chapter addresses business requirements, solution design, and prototyping. It offers suggestions
for planning and designing solutions that are directly relevant to EMC products.
Project preparation
To prepare the project, do the following:
• Read the documentation
• Define project roles
• Take training
• Create a project plan
Take Training
Training is important for project architects to understand the product architecture and current
features of EMC Documentum products. Training can be carried out in formal training classes, in
targeted meetings with engineering, or in EMC MyLearn web-based classes.
Include a Project Roadmap in the Project Plan that lays out the major phases of the project in the form
of a flow diagram and specifies the inputs and outputs of each phase. The inputs are templates that
define the minimum information that must be gathered to perform the phase. Include information
or participation needed from the client. The outputs are the deliverables produced by the phase.
Negotiate the level of participation by the client with the project sponsor in the planning phase.
Customization
Try to avoid customization. Following client specifications exactly can require costly and
time-consuming customization. It is more efficient to work within the functionality of the tools.
Customization prolongs the time to solution and creates a challenge with future upgrades. Process
Suite projects are high volume. If there are numerous customizations, the upgrade process can be
risky and expensive. The client may need to rebuild each application customization from scratch,
driving up the total cost of ownership and increasing risk. There may be some cases when you need to
write custom code, but keep it to the absolute minimum.
Do not think of these areas as being independent from one another. For example, requirements for
BAM reports can suggest data fields to incorporate into the data model. In turn, these fields will be
entered into forms by task processors (or captured in automated activities). One corollary of this
observation is that the user interface architect must communicate frequently with the information
architect to ensure consistency and completeness.
The delivery team must understand the interdependence of the Process Definition, UI, BAM, and
Data Model so they can properly advise the client. Ideally, the project team creates templates to
facilitate the client’s understanding of the interdependencies. If the client wants to change the model,
then the client understands the corresponding impact on cost. This also helps the client to understand
the importance of the information the client provides to the delivery team.
The guiding principle of this phase is Iterative Design. A solution cannot be designed in a single
step; a series of workshops is needed to reveal the design. The path is not linear, and the design can
change several times, which is relatively easy to accommodate in the planning phase. After the data
model is finalized and development begins, fundamental changes can be disruptive. For example,
changing the field definitions in the data model causes forms to break (because forms link to data
through XPath). Complete the data model in the design phase and then lock it down.
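The fragility of XPath-bound forms can be illustrated with a small sketch. The field names here are hypothetical, and the real binding mechanism is internal to Forms Builder; the point is only that a binding resolves by path, so a renamed field silently stops resolving:

```python
import xml.etree.ElementTree as ET

# Hypothetical process data instance, represented as XML.
instance = ET.fromstring(
    "<Request><submitter_name>Ann</submitter_name></Request>"
)

def read_field(data, xpath):
    """Resolve a form field's XPath binding; None means the binding is broken."""
    node = data.find(xpath)
    return node.text if node is not None else None

# The form was built against the original data model, so the binding resolves.
original_value = read_field(instance, "submitter_name")   # "Ann"

# Renaming the field in the data model silently breaks the binding.
renamed = ET.fromstring(
    "<Request><requester_name>Ann</requester_name></Request>"
)
broken_value = read_field(renamed, "submitter_name")      # None
```

This is why the data model should be locked down before form development begins: a broken binding produces no error at design time, only missing data at run time.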
After the client business team agrees with the structure and flow of the process, it can be transferred
to Process Builder to add the technical detail. As you refine the process in Process Builder, continue
to review the process flow with the client. One effective way to review the process flow is to use
the Process Debugger.
The Process Debugger allows you to execute the process one step at a time with real data. This can be
done even if the user interface has not yet been designed. This is an effective way to show the client,
while still in the design stage, how the process will work. In some cases, the client may be familiar
with how the process should work, while in other cases, the client may still be learning the process. In
general, the message is to use the Process Debugger early and use it often.
Work with the client to define a consistent approach to style. We recommend writing a Style Guide
that specifies standards for colors, skins, columns, labels, search criteria, sort order, and any items
pertaining to the UI. A detailed specification takes time to write, but it saves time during the
implementation phase of the project. Keep it simple and do not over-engineer.
Note: Design the Style Guide as a structured template to ensure that you are gathering the appropriate
information. Also use this template to show the client how things will be implemented and to track
any configuration changes. The result is to create a consistent look and feel for the application.
Responsibility: Identify processes that violate Service Level Agreements and take immediate
action.
Responsibility: Ensure that the process meets business metrics for revenue and quality.
The next step is to focus on each role and identify the report required. Write a reports summary
specification, which can be based on the following template:
Report Purpose
Report Details
Report Type
Describe the structure of the report, the information it contains, and any drill-down behavior.
Audience
Role 1
Role 2…
How often will the data in this report be changed? How often should the report be refreshed?
Filters
Drilldowns
Is there a need to drill down into this report? If so, on which field does the drill-down occur and what
is the drill-down report?
Avoid burdening the user with too much information in the dashboard. If more detail is needed,
then the report designer can create drilldowns.
For example, suppose you have a new account opening process. The executive user has requested
to see three things in the dashboard:
• The number of new accounts that were opened in each country (this week)
• The distribution of new accounts by customer type
• The average time to complete the new account opening process
The first report is a bar chart, the second report is a pie chart, and the third report is a dial-gauge. In
the client workshops, the BAM architect can present the following reports to the client in rough form:
These reports can be created in Excel, PowerPoint, or drawn by hand. The dashboard can be mocked
up by copying and pasting these reports.
The general approach is to do simple things first to ensure that everyone agrees on the report
prototypes. The report can be polished or enhanced at a later date. The goal is to confirm early and
then refine to avoid creating something that the client does not want. After the client is comfortable
with the rough dashboard prototype, progress to more polished reports that were created in Process
Reporting Services:
Over time, the number of reports will grow and drill-downs and multi-drill-downs will be defined.
You can create additional dashboard pages. Some reports can be moved from one page to another.
This process is the normal, iterative process of designing BAM dashboards.
Design Review
At the end of the design phase, the project manager can schedule a design review to validate the
design. This meeting can include architects and system experts who are not members of the solution
project in addition to EMC engineers to ensure the technical feasibility of the proposed solution. The
information gathered in the templates can be used as supporting materials to the solution design
presented to the client for signature. This process reinforces the design direction the team has taken
and justifies decisions made on the design. Following a successful design review with client sign-off,
the project can move into the implementation/development phase.
In creating the data model, consider how to set up roles and permissions and understand what type
of process data you must configure and monitor.
by manually creating the type and then adding individual attributes to it. For example, elements of
an appropriation request can be organized into the group Request with the attributes submitter_name,
submitter_address, and equipment_type. Within each SDT, you can also organize attributes into
related groups that give visual structure to the data type. For example, within the customer SDT, you
can have an address group that contains the attributes for city and state.
SDT definitions are global and can be used by any process. An SDT used by an installed process
cannot be deleted nor should any of its attributes be modified or deleted. However, it is possible
to add new attributes to an SDT that is in use.
Process variables store transient data; that is, data that is not needed after the workflow terminates.
Typically, this data is fetched from a non-Documentum system of record (another database, for
example) or is used for an internal calculation. The Process Engine manages the lifecycle of process
variables, instantiating them when the workflow is started and destroying them when the workflow
terminates.
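The lifecycle described above can be pictured as variables scoped to the workflow instance. This is a conceptual sketch only (the class and names are illustrative, not the Process Engine API):

```python
class WorkflowInstance:
    """Sketch: process variables live only for the life of the workflow."""

    def __init__(self, variable_names):
        # Instantiated when the workflow starts.
        self.variables = {name: None for name in variable_names}
        self.running = True

    def terminate(self):
        # Destroyed when the workflow terminates; the data is not retained.
        self.variables = None
        self.running = False

wf = WorkflowInstance(["credit_score"])
wf.variables["credit_score"] = 710   # fetched from an external system of record
wf.terminate()                       # transient data is gone after termination
```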
Creating an SDT for each object in your workflow can help to simplify your data model. However,
you should organize your SDTs in a logical manner, corresponding to business entities. If you have
too many SDTs, you may end up with performance problems with the TaskSpace task list. If that
occurs, then you need to consolidate the SDTs.
Understanding packages
A package is associated with a Documentum object, such as documents or image files, which is
passed between activities in an executing process.
For users to act on a document, the document must be attached to the process as a package or held in
a case folder. When documents are held in a case folder, you can attach the folder as a package, but
you may want to consider attaching other key documents as separate packages. This enables users to
act on the documents separately, so that you can perform operations such as conducting conditional
routing based on the package metadata itself.
Package data is persistent. However, package data is not shown in TaskSpace task lists or task forms.
To enable package attributes to appear in TaskSpace, you first map the attributes to process variables
by using a Process Data Mapping activity template. After you map these attributes to process
variables, you can base decisions on the package attributes, such as transitioning to the next activity.
Conversely, process variables are not persistent. If you want the data to persist beyond the life of the
process, map the process variables back to package attributes.
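The mapping in both directions can be pictured as a simple copy step. The attribute names below are illustrative; the real mapping is configured in the Process Data Mapping activity template:

```python
# Illustrative package metadata and process variables, as plain dicts.
package_attrs = {"claim_type": "auto", "priority": "high"}
process_vars = {}

def map_package_to_vars(attrs, variables, names):
    """Inbound mapping: copy package attributes into process variables so
    routing conditions and task lists can see them."""
    for name in names:
        variables[name] = attrs[name]

map_package_to_vars(package_attrs, process_vars, ["claim_type", "priority"])

# A transition condition can now route on the mapped variable.
next_activity = "FastTrack" if process_vars["priority"] == "high" else "Standard"

# Outbound mapping: copy a variable back to a package attribute so the
# value persists beyond the life of the process.
package_attrs["resolution"] = process_vars.get("resolution", "pending")
```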
When you change the name of an activity, it is equivalent to deleting it and creating an activity in
its place. All the associated information is logically deleted. Thus, it is a best practice not to change
activity names after the process has been deployed.
Process Builder is the tool used to design and implement the process. It enables you to create the
process by dragging and dropping activity templates. See the Process Builder User Guide for details
on product features and functions. This chapter describes several best practices for designing the
process by using Process Builder.
Designing activities
Consider the following when designing activities:
• Defining activity triggers, page 31
• Creating wait activities, page 31
• Sending tasks to a temporary set of users, page 32
• Invoking a secure web service, page 32
To export the certificate from the server on which the web services are deployed:
1. Type the URL for the web services in a browser.
2. Export the certificate from the browser.
Note: Refer to the web browser help or documentation for instructions on how to export the
certificate.
Import the certificate into the trust store by using the keytool command:
keytool -import -alias <alias> -file <certificate-location> -keystore <Process Builder
or Method Server (JBoss) bundled java trust store> -storepass changeit
Where
• <alias> is the unique name for the alias
• <certificate-location> is the location of the certificate
• <Process Builder or Method Server (JBoss) bundled java trust store> is the location of
Process Builder and the Method Server (JBoss) bundled Java trust store:
For Process Builder, it is: <drive>\PROGRA~1\Documentum\java\1.5.0_
12\jre\lib\security\cacerts
For the Method Server (JBoss) it is:
• %Documentum%\jboss4.2.0\jdk\jre\lib\security\cacerts (Windows)
• $DOCUMENTUM_SHARED$/jboss4.2.0/jdk/jre/lib/security/cacerts (UNIX/Linux)
3. Verify that the certificate imported successfully.
After you import the certificate into the trust store, the system displays a message stating that the
certificate was added to the keystore. To list the imported content, use the command:
keytool -v -list -alias <alias> -keystore <Process Builder or Method Server (jboss)
bundled java trust store> -storepass changeit
Designing processes
Consider the following when designing the process:
• Creating complex conditional routing with a decision split, page 33
• Using sub-processes, page 35
• Understanding message correlation, page 35
the VP of Marketing must review and approve it. If it is an engineering document, then the VP of
Engineering must review and approve it. In this example, the process diagram looks like this:
However, you may have a requirement that is more complex. Similar to the previous example,
you may want the VP of Marketing to review marketing documents or the VP of Engineering to
review engineering documents, but what do you do with a document that is neither marketing nor
engineering? In this example, a document that does not fall into one of those two categories must be
reviewed and approved by both VPs. This means that there are cases where only one task is triggered
(going to one of the VPs), whereas in other cases both tasks are triggered (going to both of the VPs).
To model this complex routing, you need to add a Join activity. Although it is more common to
create a Join after a decision split, this example uses the Join to handle the two-out-of-three case. In
this example, the process diagram looks like this:
The use of Join in the first two conditions ensures that each option leads to two activities being
selected. The trigger option at the Join activity looks like this:
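The routing rule in this example can also be expressed as a small function. The document categories are taken from the example above; the actual behavior is configured on the decision split and Join activities rather than in code:

```python
def reviewers(category):
    """Return the set of VP review tasks triggered for a document category."""
    if category == "marketing":
        return {"VP Marketing"}
    if category == "engineering":
        return {"VP Engineering"}
    # Neither marketing nor engineering: both VPs must review and approve.
    return {"VP Marketing", "VP Engineering"}

# One task for a marketing document, two tasks for any other category.
marketing_tasks = reviewers("marketing")
other_tasks = reviewers("legal")
```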
Using sub-processes
Process Builder enables you to create in-line sub-processes. Using sub-processes can improve your
ability to communicate the structure and the business meaning of a process template. A large
or complicated process can become difficult to organize visually when there are many activities
required to complete an entire workflow. To simplify the layout of a process, group related activities
into sub-processes that collectively represent a business process.
In Process Builder, sub-processes can be expanded to view the individual activities or collapsed to
create a more simplified overview of a process. The process contains activities that are related in some
way and are grouped into a container for ease of administration. This can be useful when grouping a
set of activities that collectively represent a business function or a logical step in a process. Activities
that share the same process data can also be grouped into a sub-process.
It is also possible for one process to invoke another process. In this case the invoking process is called
the parent and the invoked process is called the child. If you want to build a process that invokes one
or more child processes, use the Invoke Process activity template. If you need the child process to
post events to the parent process, use the Post Event to Parent Process activity template.
to enable correlation in the activities of that process. You can use one or more process variables to
create a correlation set to uniquely identify the process instance.
Use the data mapper’s copy function to compare one of the attributes of your incoming data to the
value of one of your process variables belonging to a correlation set.
Note: The copy function is used to compare these values. If there is a match between the value of
the process variable (belonging to a correlation set) and the value of a data attribute of the incoming
message, the match is successful and the step activity is completed.
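Conceptually, correlation matching compares the attributes of an incoming message against the correlation-set values of running process instances. The instance data below is illustrative; the Process Engine performs this matching internally:

```python
# Running process instances, each carrying its correlation-set values.
instances = [
    {"id": 1, "order_id": "A-100"},
    {"id": 2, "order_id": "A-200"},
]

def correlate(message, instances, keys):
    """Return the instance whose correlation set matches the message, or None."""
    for inst in instances:
        if all(inst.get(k) == message.get(k) for k in keys):
            return inst
    return None

# The incoming message matches instance 2 on the order_id correlation key.
match = correlate({"order_id": "A-200", "total": 99}, instances, ["order_id"])
no_match = correlate({"order_id": "A-999"}, instances, ["order_id"])
```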
5. Click the handle on the left side of the copy function to complete the mapping.
As you can see, Var 0 on the left has been mapped to Var 0 and to Var 1 on the right.
4. Double-click the Split function that is connecting the left hand side to the right hand side.
5. Click the plus sign (+) and add a period (.).
6. Click the plus sign (+) and add 1.
7. Use the up/down arrows to reorder them as follows:
In general, you can use this approach whenever you are working with repeating elements and you
wish to take action for each one of the elements.
When you are mapping repeating nodes, you want to retrieve all the values of the repeating node
from the left-hand side and then insert these values in the proper position on the right-hand side. To
achieve this, multi-valued attributes have an Add link next to their names. Use the Add link to create
a node that represents a specific index of multi-valued attributes.
LAST creates the attribute at the end of any other existing attributes. The input values are
appended after any other values. This ensures that mapped data does not overwrite any existing
value.
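The LAST option's append semantics for a multi-valued attribute can be sketched in one line (the values are illustrative):

```python
def map_last(existing, incoming):
    """LAST: append incoming values after existing ones, overwriting nothing."""
    return existing + incoming

# Existing values are preserved; the mapped value lands at the end.
values = map_last(["red", "green"], ["blue"])
```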
Looping
Many applications must perform a series of activities several times in succession. Process Builder
enables you to define loops by using a counter mechanism. The following process shows how to set
up looping within a process. In the first activity, a human performer specifies a set of departments.
Then, an automated loop cycles through each department, invoking a process for each department.
necessary because the index of a multi-valued attribute starts at zero, so if you have three values,
the index must decrement from 2 to 0, rather than 3 to 1.
4. Add an automatic activity as a decremental decision transition to begin the loop.
The activity executes a no-op (Begin Loop) operation (dm_noop_auto_method). The trigger is
configured to create a task when either of the two inputs is followed. Configure it to trigger
for 1 out of 2 possible input flows.
5. Next, include an Invoke Process activity template to map the data into the child process.
In the data mapper, use the Get Value function to set the value of the process variable department
on the new process instance.
6. Create a decision split activity called End Loop. In this activity, there is a transition condition that
returns to the loop if the counter is greater than zero. If the counter is zero, the process exits the
loop. The Cleanup activity (also a no-op) is needed to connect to the end of the process.
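The counter mechanism above can be sketched as a loop that decrements a zero-based index and invokes the child process once per department. This is a conceptual sketch; in Process Builder the loop is modeled with the Begin Loop, Invoke Process, and End Loop activities described in the steps above:

```python
def run_loop(departments, invoke):
    """Cycle through each department, decrementing a zero-based counter."""
    counter = len(departments) - 1      # three values -> counter starts at 2
    while counter >= 0:
        invoke(departments[counter])    # Invoke Process for this department
        counter -= 1                    # decrement from 2 down to 0, then exit
    # The Cleanup activity would run here, connecting to the end of the process.

invoked = []
run_loop(["sales", "hr", "it"], invoked.append)
```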
3. On the Timers tab, set a timeout interval for the Wait step.
In our example, we set the post timer to expire one minute after the task is created. (This is the
polling interval. We assume that the child processes are short and will complete within one
minute.) When the timer expires, the task will complete.
4. On the Trigger tab, set the trigger condition of the Wait activity to 1 out of 2 possible input flows,
since it can be triggered in two mutually exclusive ways.
After the timer expires and the task is completed, the process proceeds to the Count Children activity.
3. On the Transition tab, configure which activity is next in the process and what the conditions are.
The following screenshot illustrates using a count function to count the number of
r_object_ids returned from the query and to assign this number to the process variable
runningChildrenCount. When runningChildrenCount is equal to zero, the process proceeds to
the Cleanup activity. When runningChildrenCount is greater than zero, then there are still
running instances of the child process and the process returns to the Wait activity.
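The wait-and-count pattern is essentially a polling loop: wait for the timer interval, count the running children, and exit when the count reaches zero. The sketch below stands in a fake query for the DQL count (the polling values are illustrative):

```python
# Fake query results: the number of running child instances on each poll.
poll_results = iter([3, 1, 0])

def running_children_count():
    """Stand-in for the query that counts r_object_ids of running children."""
    return next(poll_results)

polls = 0
while True:
    polls += 1                        # Wait activity: timer expires, task completes
    count = running_children_count()  # Count Children activity runs the query
    if count == 0:
        break                         # transition to the Cleanup activity
    # count > 0: children still running, return to the Wait activity
```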
Inter-process communication
Process Builder includes an Invoke Process activity template that enables a parent process to invoke a
child process. For example, suppose you have an order process that must perform a credit check.
Instead of building the credit check procedure into the order process, create it as a separate process to
be invoked as a child process. You can manage the credit check process separately and independently,
and it can be called by other parent processes.
The Invoke Process activity template provides a basic mechanism to pass process data from the parent
process to the child process. By using the Post Event to Parent activity template, the child process
communicates to the parent process by posting events to the parent process instance. However, the
posted event is only a name that carries no payload to the parent. To pass a data payload from the
child to the parent, use the following approach:
In this example, the parent process invokes the child process in the third activity:
The Invoke Process activity passes a workflow correlation identifier to the child. The child process
needs this identifier to call the correct parent process instance. Map the correlation identifier of the
parent to the variable parentInstanceCorrId of the child, as illustrated:
After invoking the child process, the parent process instance waits for a response message with the
payload from the child in the HTTP Inbound - Step activity. The following shows the configuration
for the HTTP Inbound - Step activity:
In the HTTP Inbound - Step activity, the parent sets the Correlation Property Name to CorrId. When
the child process instance posts an HTTP request, the system looks for a workflow instance with a
correlation identifier that matches CorrId in the URL Parameter of the request.
When configuring the HTTP Inbound - Step activity, select With Attachments. The attachment carries
the payload from the child process to the parent process.
To enable the payload to pass from the child process to the parent process, map the HTTP attachment
data posted by the child to the attachment data field of the parent, as illustrated:
After the parent process invokes the child process, the child process carries out the request of the
parent. The child process then posts a response message back to the parent instance in the HTTP
Outbound activity. The parent process instance waits for the message with the payload from the
child and receives it in the HTTP Inbound - Step activity.
Configure the HTTP Outbound activity as illustrated:
For the child process instance to post the message to the correct parent process instance, it uses the
parentInstanceCorrId. The child process receives this identifier from the parent process when the
parent invoked the child. This variable is mapped to the URL parameter CorrId as illustrated:
Attachments pass from the child process instance to the parent process instance by mapping them to
the HTTP Request attachments.
This approach was based on HTTP POST. An alternate approach is to use JMS as the protocol for
inter-process communications. This approach is useful when the child process must pass data to the
parent process.
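The HTTP correlation handshake can be sketched end to end: the parent waits in the HTTP Inbound - Step activity keyed by its correlation identifier, and the child posts a request whose CorrId URL parameter selects the right waiting instance. The dictionary and identifiers below are illustrative, not the Process Engine's internal representation:

```python
# Parent side: instances waiting in HTTP Inbound - Step, keyed by their
# correlation identifier (Correlation Property Name = "CorrId").
waiting = {"wf-42": {"attachment": None}}

def http_inbound(url_params, attachment):
    """Match the posted request to the waiting parent instance by CorrId."""
    corr_id = url_params["CorrId"]
    parent = waiting.get(corr_id)
    if parent is None:
        return False                     # no matching parent instance
    parent["attachment"] = attachment    # the payload passes via the attachment
    return True

# Child side (HTTP Outbound): parentInstanceCorrId was mapped in at invoke
# time, and is mapped to the CorrId URL parameter of the request.
delivered = http_inbound({"CorrId": "wf-42"}, b"credit-check-result")
```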
at or right after the service invocation step. As a recommended best practice, as you build up your
processes, particularly with web services integration activities, regularly test them with the debugger.
Use the debugger every time you change the process to ensure the integrity of data that may have
changed.
3. In your integrated development environment (IDE), add a new debug configuration to connect
the host and port.
4. Place breakpoints in your code, run this debug configuration from your IDE, and begin
debugging your process from Process Builder.
Whenever an activity that uses your custom method is reached, you can debug through your
custom method code.
Note: To make the method code available to the process debugger, the code must be a class located
in the appropriate directory structure, or a JAR file, under the \classes\custom folder in the
Process Builder installation directory.
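For step 3 to work, the JVM hosting the process debugger must accept remote connections on the host and port that your IDE targets. The options below are the standard JPDA remote-debug flags; port 8000 is only an example and must match your IDE debug configuration:

```shell
# Standard JPDA remote-debug options to add to the Java command line;
# port 8000 is an example -- match it to your IDE debug configuration.
JAVA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000"
```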
You must activate the audit trail for each process that you want BAM to monitor. Activating the audit
trail enables the BAM server to extract the reporting data and insert it into the BAM database.
In the Process Template Properties dialog box, select the General tab. In the Audit Trail Settings
group box, select the On option.
When auditing is on, audit trail information is saved for each workflow instance created from this
template.
Note: To change audit trail settings, you must have CONFIG_AUDIT privileges. CONFIG_AUDIT is
an extended user privilege and must be granted by a repository owner or superuser using
Documentum TaskSpace or Documentum Webtop. The Documentum Webtop User Guide and the
Documentum TaskSpace Configuration Guide provide more details on extending user privileges.
You must specify the activities in which you want to record process data to BAM. It is not necessary
(or desirable) to do this for every activity. Select the SDT attributes to include in reporting from
each activity.
Use the Add Structured Data Type Wizard to create structured data types. To record a data attribute
in the BAM database, select the Reportable checkbox in the wizard. Make sure to update the data
type definitions with BAM by selecting Update BAM Database tables based on this SDT definition.
This procedure creates the tables in the BAM database, which are used to report on business data.
Examples of such reports include total weekly revenue per branch office, average duration of claim
resolution by type, and number of new grants applications by state.
Report data can come from packages as well as from SDTs. Select the packages to include in reporting
at the process level or in the individual activity. To enable Process Builder to publish reporting data
to the BAM database for the package, select This package can be used to generate reports in the
Process Properties dialog box or in the Data tab of the Activity Inspector. When you do this, all the
custom attributes of the object are available to the BAM database. Unlike the case with SDTs, you
cannot select individual attributes for reporting.
You must specify the activities in which you want to record process data to BAM. It is not necessary (or
desirable) to do this for every activity. Select the packages to include in reporting from each activity.
You must specify the activities in which you want to record process data to BAM. It is not necessary (or
desirable) to do this for every activity. Select the variables to include in reporting from each activity.
Process variables can be simple types (Boolean or string) or can be structured data types that were
selected for reporting when they were created. To expose this variable and use it to generate reports,
select the This variable can be used to generate reports checkbox in the Data tab of the activity.
If you change the process data in a process, there is a danger that these changes are not reflected in
the BAM database structure. Ensure that any changes to the business data that you are monitoring
have been updated in the BAM database. The Update BAM Data Definitions page enables you to
update selected process data from Process Builder against the existing reporting data in the BAM
database, keeping the structure of the data in Process Builder consistent with the structure in BAM.
Select Tools > Update BAM Data Definitions to use the Update BAM Data Definitions page.
Deployment considerations
Plan for the following when designing the process:
• Migrating or upgrading Process Builder, page 60
• Process versioning, page 60
Process versioning
Use versioning carefully and sparingly to avoid creating inconsistencies in the process data. Because
creating different versions of a process can cause process data to get out of sync, it is best to create a
new version only when the process is deployed to the production environment.
During design, it is a best practice to use the process debugger frequently to troubleshoot and
validate the process without installing it. When the activities and the data model are defined and
the debugger is run without errors, then check the process into the repository. Install the process
only when you want to run it.
Note: It is not recommended to use Save As on a process multiple times and then install each of the
cloned processes. This practice can lead to instabilities at run time.
Building the user interface for a TCM application involves creating form templates in Forms Builder
and implementing the application in TaskSpace.
• Make a prototype and get it fully functional to help ensure that the UI design is as user-friendly as
possible and that the client approves of the design before doing the actual development. Keep in
mind that the size and complexity of the prototype is not necessarily indicative of the development
effort. The development time will be greater due to coding complexity and customization. You
also cannot necessarily predict performance issues from the prototype.
• Consider how many task templates you need. You may want to make one template for each step or
activity in the process, and they should all have a common look. This also improves performance.
• When initially designing templates, keep in mind that it can sometimes be easier and quicker to
delete a template and start over again.
• Sometimes what the client wants is not a good design. For example, just because you can search
on 20 attributes does not mean that you should.
• Try to design forms so there is no scrolling for the end user. This may mean using more tabs in
TaskSpace.
• After creating a template with the template wizard, if you start changing and adding controls
and moving items around on the canvas (especially if you use drag and drop), frequently verify
how things look in the Preview pane. When using drag and drop, items can appear to line up
along the vertical bar, but the spacing and padding appear differently on the Design pane than
on the Preview pane or at runtime. To ensure correct alignment, especially for labels, use the
settings on the Style tab.
The most straightforward way to start a process is to go to the Processes tab. You can select one of the
processes that are listed there and click Start Process.
You can use a search template that is configured to search on processes in a specific folder. This is
useful if you want to filter the list of processes that a user sees.
On a search tab in TaskSpace, you select a process from the search results set and then select Start
from the context menu.
To make things simpler and avoid selecting from a list of processes, you can start a specific process
using a configurable action on a custom tab:
1. Create a configurable action to start a process.
2. Select the specific process to start.
3. Add the action to a custom tab.
If you first receive a document and then want to start a process with the document as an attachment,
you can start a specific process with an attachment using a configurable action:
1. Create a configurable action to start a process with a selected object.
2. Configure how the object is routed by selecting Dynamic Object Value Selection.
5. Start the process from a search by selecting a document and then clicking the configured action
button. The process starts with the selected document added as an attachment.
The standard option when a process starts is to open an initiate process template. Each template is
associated with a specific process and you can show process variables and use a package control to
allow users to upload or select an existing document to be the package content. You can also further
customize the template by adding additional controls.
If you want a specific form as a package in the process, which appears when the process starts,
you can use an electronic form in a package:
1. Create an electronic form and specify it with a Start Activity.
2. Open the Activity Inspector for the Start Activity in Process Builder, specify the electronic form
with a package, and select to automatically launch the package.
3. At runtime, when the process starts, the configured electronic form in the package appears
automatically. This is useful if you need to fill out a specific form for the associated process.
4. When the user submits the form, if any of the mandatory packages are not fulfilled, the user is
prompted to fulfill the mandatory packages.
You can display a specific document package inline using the task template with an embedded
form control.
When selecting the data binding for the embedded form, be sure to bind to the valid object path:
/Activity/.../<package name>/DocumentId (you can also bind to a process variable assuming that the
variable value is an object ID).
Next, select the appropriate form template to show the package content, which is a document view
template or an electronic form. You can then preview a document and modify document data while
processing the task. It is possible to have multiple embedded forms in one view, but the UI can be
busy when there are multiple packages.
You can display a specific folder package inline using the task template with a folder view control.
Select a folder package for the data binding, then configure how the folder view is displayed. This
option allows you to treat a folder as a case and preview a document while processing a task, but you
must show less information due to space limitations.
You can display multiple packages inline using the task template with a folder view control and
an attachment list control. This option allows you to accommodate multiple types of packages
(documents and folders) but you are limited by available space. The folder view can display a folder
object from the attachment list. You can use the preview pane to view an object from the folder view
or a document from the attachment list. When viewing a document from the attachment list, the
folder view is empty.
Figure 9. Task template with folder view and attachment list controls – folder information displayed
Figure 10. Task Template with folder view and attachment list – document displayed
process correctly aligned. The recommended practice is to always create a new version of the process
and then create a new version of the template. When you create a new version of one of these
templates without also creating a new version of the process, the old version of the template becomes
orphaned. When a template is orphaned, it cannot be used until it is linked back to a process. In the
following graphic, if you created form template version 1.1 before creating process template 1.1,
then form template 1.0 would become orphaned.
When a form template associated with a process is checked out and a new version created, Forms
Builder automatically links the process to the latest version of the template. If you delete the most
recent form template version, however, the version link breaks and the process is no longer associated
with a form template. You also cannot delete a task, initiate process, or process parameter template if
the template and its associated process are both non-current versions. However, if either the process
or the associated form template is the current version, you can delete the form template.
enter. A good example would be: Enter a valid date using the format MM/DD/YYYY.
Another technique that is helpful to users is to provide default values for controls that use specific
masks, such as dates, social security numbers, and phone numbers, so that the user has an example to
follow before entering a new value.
If a control does not have a custom error message defined in the template and the value entered is
not valid, the system displays a default error message. The default error messages are generic and
only provide guidance as to the data constraints for the control. An example of a default message
for an invalid data type is: Value entered is not valid for the integer data type.
An example of a default message for an invalid value for a specific data constraint is: Enter a
valid value equal to or greater than %% (where %% is the defined minimum value). While these
messages may be helpful to the user, they are not as helpful as messages created specifically for a
template, tailored to the data requirements for that template.
Data adaptors
Some general guidelines for using adaptors are:
• Do not use too many different adaptors, as this can impede performance. You can use one data
source adaptor to populate multiple fields at once by querying a single database table with
multiple columns. This way, the adaptor runs only once.
• Keep adaptors simple and lightweight and keep the execution time of data source adaptors
as short as possible.
• Do not perform high latency calls within data source adaptors, such as long-running queries or
slow web service invocations, as these calls can greatly impede performance of the UI.
• If you plan to use more than one adaptor, concentrate on getting one adaptor to work before
creating another.
• When using a data source adaptor, make sure that the input is required and that the output is
repeating.
• When creating a custom data source adaptor, you can create it either as a class or as a
service-based business object (SBO). If you create it as a class first, it is much easier to validate
and debug the adaptor. You can then convert the adaptor to an SBO if desired.
The following information describes which templates or controls each adaptor can be used with,
guidelines for how best to use each type of adaptor, and important limitations.

Item Validator

Availability: Not available on high fidelity templates. Available on these controls: Text Field,
RichText, Number Field, Date Field, DateTime Field, CheckBox, CheckBox Group, Radio Button
Group, ListBox, Dropdown List, Comment history, Readonly Table, Slider, Embedded Form,
and Filter.

Use this adaptor if you want to validate a user's input values on individual fields. An item
validator applies to one field and occurs when the user submits the template. For example, if a
data item is an email address, the item validator could check that it contains exactly one @, at
least one dot after the @, and a recognized domain after the final dot.

If an error occurs, an error indicator appears next to the control and the full error message
appears at the bottom of the template. You can create custom error messages (recommended); if
there is no custom message, a default error message appears. However, if there is a hidden field
on the form that is also marked as required (this should be avoided), the error message still
appears at the bottom. You can create a custom error message in a validator adaptor by throwing
an exception in the validator adaptor code.
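The email check described above can be sketched as plain Java validation logic. This is an illustration only: the actual validator adaptor interface that Forms Builder invokes is not shown, and the list of recognized domains is a placeholder. Throwing an exception is how the custom error message is surfaced.

```java
import java.util.Set;

public class EmailItemValidatorSketch {
    // Placeholder list of recognized top-level domains (an assumption).
    static final Set<String> KNOWN_TLDS = Set.of("com", "org", "net", "gov", "edu");

    // Throwing an exception surfaces a custom error message; a real validator
    // adaptor would throw from its adaptor method rather than a static helper.
    static void validateEmail(String value) {
        int at = value.indexOf('@');
        if (at < 1 || at != value.lastIndexOf('@')) {
            throw new IllegalArgumentException("Enter an address containing exactly one @.");
        }
        String domain = value.substring(at + 1);
        int lastDot = domain.lastIndexOf('.');
        if (lastDot < 1) {
            throw new IllegalArgumentException("Enter a domain with at least one dot after the @.");
        }
        if (!KNOWN_TLDS.contains(domain.substring(lastDot + 1).toLowerCase())) {
            throw new IllegalArgumentException("Enter a recognized domain after the final dot.");
        }
    }

    public static void main(String[] args) {
        validateEmail("jane.doe@example.com"); // passes without an exception
        System.out.println("valid");
    }
}
```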
If you want to automatically populate multiple fields on a template, creating one data source adaptor
that populates multiple fields greatly improves performance. The following example takes you
through creating two data source adaptors. The first adaptor populates a list box of employee names
from the repository. When a name is selected, the second adaptor then automatically populates ID,
E-mail Address, Phone Number, and Department fields for the selected employee from an external
database.
The first data source adaptor populates the Name list box with employee names from the repository.
1. Open the Adaptor Configuration Manager and add a new adaptor with the name
GetAllEmployeeNames.
2. Select Data Source as the adaptor type.
3. Select com.documentum.xforms.engine.adaptor.datasource.docbase.IDocbaseDataSourceAdaptorService
as the class name and SBO as the type.
4. Enter dql for the initial parameter and enter the appropriate DQL clause.
5. Specify the output type to describe the return value. Note that the columns returned from the
DQL query conform to the items in the output schema.
6. Click OK to save your configuration and close the Adaptor Configuration Manager.
7. On the Design panel, select the Name drop-down list and open the Properties > Data & Storage
tab.
8. In the External Data Source field, select the GetAllEmployeeNames adaptor.
9. Configure the data source output by specifying the Row Selection, which is the repeating
element that is the parent element of the elements you will select in the Value and Display
fields (/data/item). In the Value field, select the element that provides the data value
(/data/item/employeeid). In the Display field, select the element that is what the users will
see (/data/item/name).
You have finished configuring the data source adaptor to populate the Name drop-down list.
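The DQL clause entered in step 4 might look like the following sketch, which assumes employee records are stored as dm_user objects; your repository's employee object type and attribute names may differ. The columns returned must conform to the items defined in the adaptor's output schema.

```java
public class EmployeeNamesDqlSketch {
    // Hypothetical DQL for step 4: the returned columns (r_object_id,
    // user_name) must line up with the items defined in the output schema.
    static final String DQL =
        "select r_object_id, user_name from dm_user order by user_name";

    public static void main(String[] args) {
        System.out.println(DQL);
    }
}
```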
The second data source adaptor populates ID, E-mail Address, Phone Number, and Department
fields for the employee selected in the Name field.
1. Open the Adaptor Configuration Manager and add a new adaptor with the name
GetEmployeeInformation.
2. Select Data Source as the adaptor type.
3. Select com.documentum.xforms.engine.adaptor.datasource.jdbc.JDBCDataSourceAdaptor as
the class name and Class as the type.
4. Enter sql for the initial parameter and enter the appropriate SQL clause for pulling information
from the database. In the SQL clause, you specify the parameter with ${<param name>} (if
using DQL, use single quotes, for example, id='${id}').
5. Define the substitute parameter in the Input field. Be sure to select the Required option for the
input field name. The parameter must match the input name.
6. Specify the output type to describe the return value. Note that the columns returned from the
SQL query conform to the items in the output schema. This should include all the employee
information fields you want to populate.
7. Click OK to save your configuration and close the Adaptor Configuration Manager.
8. On the Design panel, select the ID field and open the Properties > Special tab.
9. Select the option to Execute adaptor after the input value changes. This triggers the adaptor to
fire after selecting an employee in the Name field.
10. In the Data Source field, select the GetEmployeeInformation adaptor.
11. In the input binding id field, select the xpath where the input value is stored (/employee_id).
When you specify an input value, the adaptor returns database table values associated with that
input value. In the Output data field, select the xpath where the output is stored (for the ID
field, this would be /data/item/employee_id). This indicates the value in the database table that
populates the field.
12. Repeat the configuration on the Special tab for the other fields you want to populate from
this adaptor. The settings are the same for each, except that the output data should match
the field being populated. For example, for the E-mail Address field, you would select
/data/item/email_address in the Output data field.
13. Add the properties file for this JDBC adaptor to the <web app>\WEB-INF\classes directory.
The properties file name corresponds to the adaptor name and in this case would be
jdbc_GetEmployeeInformation.properties. The properties file specifies the location of the data
source and standard JDBC configuration parameters, such as login credentials for the data source
and the path to the JDBC driver. The properties file for this adaptor would be similar to:
url=jdbc:mysql://localhost:3306/mysql
driver=com.mysql.jdbc.Driver
user=root
password=forms (or password=pT9oeWTVuFI=)
To encrypt the password, you can use the DFC encryption method:
java com.documentum.fc.tools.RegistryPasswordUtils <password>
You have finished configuring the data source adaptor to populate the employee information fields.
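The ${<param name>} convention from step 4 works along these lines. This is a simplified sketch of the substitution behavior, not the JDBC adaptor's actual implementation, which should bind parameters safely rather than splice strings.

```java
public class SqlParamSubstitutionSketch {
    // Replaces each ${name} placeholder with the supplied input value, the
    // way the JDBC adaptor substitutes its input parameter into the SQL
    // clause. A production implementation should use bound parameters.
    static String substitute(String sql, String name, String value) {
        return sql.replace("${" + name + "}", value);
    }

    public static void main(String[] args) {
        String template = "select email_address from employees where employee_id = ${id}";
        System.out.println(substitute(template, "id", "1042"));
    }
}
```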
Performance
When working with form templates, the following tips can improve performance:
• From the beginning of the project through the design, maintain focus on performance.
Performance takes precedence over appearance and special effects.
• Reducing complexity in form templates aids TaskSpace performance: fewer fields and fewer
adaptors.
• On a task list template, keep the search simple as this greatly aids performance.
• While using multiple task templates for different activities may improve performance, it also
increases maintenance. A better approach may be to use one task template with conditional
display settings.
When you create a task template, Forms Builder automatically adds the appropriate buttons based on
whether there are one or more forward tasks, if the forwarding is conditional or manual, or if there
are reject paths. However, if you add or remove process flows when modifying the process, these
buttons remain static and therefore do not necessarily reflect the new process flows. When you test
the application you may find that you are not able to finish or forward a task.
The easiest solution is to add ALL task buttons that you think you may need. TaskSpace automatically
shows and hides buttons as appropriate. This can save you much time during development. As a
good practice, name the buttons with the same name as the actual activity (for example, if it is a
Reject button, label it Reject). You can give it a custom display label, but the name should reflect
the actual activity being performed.
If you change the underlying data model (such as a structured data type definition), update the task
template itself to reflect this change, otherwise the Finish button may fail to work.
1. Uninstall the task template.
2. Save the task template.
3. Install the task template again to refresh the template’s data model.
When configuring current object actions (that is, selecting Current Folder, Current Calendar, or
Current Object as opposed to Selected Object) be sure to select the Always option for the Enable
action if setting.
A current object action is invoked against an object you are currently viewing (in the Open Item tab)
such as a document, folder or task. Always indicates that the required action parameters are supplied
by the object being viewed and that the action can be evaluated when the action control is rendered.
When configuring selected object actions (that is, selecting Selected Object as opposed to Current
Folder, Current Calendar, or Current Object), be sure to select the One object selected\More than
one object... option for the Enable action if setting.
A selected object action is invoked against an object you have selected in the data grid. One object
selected\More than one object... indicates that the required action parameters are not present until
objects are selected in the data grid and that the action should not be evaluated until then.
Do not assign current object actions for use in configured menus. It is confusing to the user as to
when menu items should be enabled or disabled:
• If an object is selected in a folder content data grid, the action is enabled on the menu.
• If an object is selected in a search view data grid, the action option is disabled on the menu.
• Because the folder view provides proper current object context, the action can be enabled in
folder view but not in search view.
This section contains information that partners and field engineers can use to better implement
Business Activity Monitor (BAM).
System requirements
It is a best practice to use the BAM Sizing Calculator to determine your hardware requirements before
you install Business Activity Monitor. The sizing calculator is available on the EMC Download Center
in the Documentum Business Activity Monitor Supplemental Files zip file. This tool calculates the size
of the BAM database, the BAM server, and the TaskSpace/BAM dashboard server, based on several
metrics that you enter, including the number of processes you are monitoring, the number of process
instances monitored per day, and the average number of activities in each process.
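As a rough back-of-envelope check only (not the sizing calculator's actual formulas), the audit volume BAM must absorb grows with exactly the metrics listed above:

```java
public class BamVolumeSketch {
    // Rough daily audit-event count: each activity execution in each
    // monitored process instance generates at least one audit row.
    // This is an illustrative approximation, not the calculator's formula.
    static long auditRowsPerDay(long instancesPerDay, int avgActivitiesPerProcess) {
        return instancesPerDay * avgActivitiesPerProcess;
    }

    public static void main(String[] args) {
        // For example, 5,000 monitored instances per day, 12 activities each.
        System.out.println(auditRowsPerDay(5_000, 12)); // prints 60000
    }
}
```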
Reporting requirements
Planning is the most important step in deploying any component of the Process Suite, including the
Business Activity Monitor. If you are deploying BAM with Process Builder, Forms Builder, and
TaskSpace, it is a best practice to define your reporting requirements as one of your first steps in
deploying the Process Suite. This requires that you brainstorm and design mock-ups of the BAM
reports and dashboards that your business requires, even before you begin to design your process.
Your reporting requirements can have a large impact on how your process is designed and the
substance and structure of your data model, including SDTs and package attributes.
The more detail included in your requirements specifications, the better. At a minimum, reporting
requirements should include:
• purpose of report
• report type (management report, operational report)
• report columns
• a description of drill-down relationships (both single and multi-drill-down reports)
• audience
• frequency and timing
• report filters and dashboard filters (both default and initial filters)
• graphical representation
Providing a sample of the report helps identify the attributes that must be monitored in the process.
The easiest way to design mock-ups is with Microsoft Excel where reports can be formatted
numerous ways (tables, pie charts, bar charts, and so on). Although a fairly simple example, the
following mock-up highlights that vendor and amount attributes must be monitored within the invoice
process. In addition, since this report calculates an average amount, we know that aggregation is also
required. This mock-up provides important information that impacts the design of your process.
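The aggregation the mock-up implies can be expressed concretely. The invoice data below is made up; the sketch only shows that the vendor and amount attributes feed a per-vendor average.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class InvoiceMockupSketch {
    // Hypothetical monitored attributes from the invoice process.
    record Invoice(String vendor, double amount) {}

    // Average invoice amount per vendor -- the aggregation the mock-up implies.
    static Map<String, Double> averageAmountByVendor(List<Invoice> invoices) {
        return invoices.stream().collect(
            Collectors.groupingBy(Invoice::vendor,
                Collectors.averagingDouble(Invoice::amount)));
    }

    public static void main(String[] args) {
        List<Invoice> sample = List.of(
            new Invoice("Acme", 100.0),
            new Invoice("Acme", 300.0),
            new Invoice("Globex", 50.0));
        System.out.println(averageAmountByVendor(sample));
    }
}
```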
Identifying reports that rely on aggregation is an important part of defining report requirements.
BAM provides three methods of aggregating report data. First, there is report aggregation which
is based on instance-level data. Then, there is server aggregation where data is automatically
aggregated for nine different time intervals (5 minutes, daily, and so on). And finally, there is custom
aggregation. The Business Activity Monitor Implementation Guide has more information on each type of
aggregation. In terms of planning, anticipating the type of aggregation you require is helpful. For
example, if you have high-volume processes where thousands of instances are running each day, then
report aggregation is not recommended. Attempting to aggregate large volumes of instance data will
severely compromise BAM server performance.
amounts of detail. Users can navigate, or drill-down, from one report to another based on their
needs. Multi-drill-down reports update the contents in surrounding target reports based on a
user's selection in a base report. Single and multi-drill-down reports are addressed in the Business
Activity Monitor Implementation Guide.
Drill-down reports are well-suited in situations where the purpose of the dashboard is to identify
root causes of process problems. Users must be able to move from one level of detail to another,
while attempting to isolate the process instances that are problematic. Dashboards configured
in this way should be provided to users that also have the authority to change the process, if
necessary.
2. Dashboard users — It is best practice to always consider the characteristics of your dashboard
users as you design reports and begin to think through the contents of each dashboard.
Individual users are assigned to one or more roles and dashboards are assigned to roles. That is
the extent of the security, so when a dashboard is assigned to a role, all users associated with that
role can view the contents of the dashboard. It is important to compile a complete list of users
and roles so you do not inadvertently assign dashboards to users that do not require them. You
may find that the roles available in the repository are too broad. In this case, you may need to
create separate dashboard roles. Another method for controlling access is to use filter variables
that limit the data displayed in dashboard reports to that owned by a specific dashboard user.
3. Number of dashlets for each dashboard — The optimum number of dashlets contained in a
single dashboard is determined by the resolution of the users’ monitors. If monitor resolution
is 1440 x 900, then no more than four dashlets should be included in a dashboard. If monitor
resolution is set to 1920 x 1200, then up to six dashlets can be placed on a dashboard. It is best
practice, then, to know the capabilities of users’ hardware, and to plan for the lowest common
denominator. If 50 users have the higher monitor resolution, and 10 have the lower resolution,
then plan for dashboards to contain four dashlets.
There are a few other points to consider:
• Dashlets containing Crystal Reports require more space, so take this into consideration when
you are planning dashboards.
• Small dashlets can be maximized.
selection of chart types, and a richer syntax for writing computed column formulas. If you require a
high degree of control over the look and feel of a report, then Crystal Reports is the best option.
Custom aggregation
Your need to define custom aggregation report entities relates to the attributes you have selected to
monitor. As a practical matter, do not collect data unless you absolutely must. If you need process
data only once, then collect it at the end. If you need data more than once, for multiple activities in
a process, then you may have problems with business data aggregation. For example, if a numeric
attribute value during Activity A is 3 and the same attribute value for Activity D is 9, then the
calculated average is 6, which is inaccurate. In reality, the value that should be incorporated into the
calculation is 9, not 6. In these cases, custom aggregation report entities must be created. Custom
aggregation must also be used when you want to combine data from multiple data sources and in
situations when you need to combine process data and business data. For example, if you want to
calculate the average duration it takes to process orders from the state of California, then custom
aggregation must be used. Custom aggregation can also be used to improve BAM server performance
when report aggregation attempts to collapse thousands of instance-level records.
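The Activity A/Activity D example above can be made concrete. The sketch below uses hypothetical data and a simplistic last-value rule purely to illustrate why instance-level report aggregation misstates the average:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class AggregationPitfallSketch {
    // One recorded attribute value per (instance, activity) pair.
    record Sample(String instanceId, String activity, double value) {}

    // Naive report-style aggregation: averages every recorded value,
    // counting the same instance once per activity.
    static double naiveAverage(List<Sample> samples) {
        return samples.stream().mapToDouble(Sample::value).average().orElse(0);
    }

    // Custom-aggregation style: keep only the last recorded value per
    // process instance before averaging (samples assumed in activity order).
    static double lastValueAverage(List<Sample> samples) {
        Map<String, Double> last = new LinkedHashMap<>();
        for (Sample s : samples) {
            last.put(s.instanceId(), s.value()); // later activities overwrite earlier ones
        }
        return last.values().stream().mapToDouble(Double::doubleValue).average().orElse(0);
    }

    public static void main(String[] args) {
        List<Sample> samples = List.of(
            new Sample("inst-1", "Activity A", 3),
            new Sample("inst-1", "Activity D", 9));
        System.out.println(naiveAverage(samples));     // prints 6.0 -- the inaccurate figure
        System.out.println(lastValueAverage(samples)); // prints 9.0 -- the value that should count
    }
}
```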
For more on custom aggregation, please see the Creating Custom Aggregation, Report, and Filter Entities
chapter of the Business Activity Monitor Implementation Guide.
Preconfigured Dashboard reports is to open them for editing in PRS, and add and/or remove data as
appropriate. For instance, you might add SDT or package attribute data to the List of Process Instance
report, since business data is not included in any of the Preconfigured Dashboard reports.
Caution: If you modify any of the Preconfigured Dashboard reports, you must understand how
to reformat the chart data. Adding and subtracting data disturbs the X-axis and Y-axis settings.
Another approach to leveraging the Preconfigured Dashboards is to design a report and replace one
of the dashlets in any of the three dashboards. With this approach you do not need to design a
dashboard from scratch; you simply replace the contents of a dashboard that already exists.
You are not required to create a dashboard tab, or assign the tab to roles, as is typically required when
you design a dashboard. Keep in mind, though, that editing a dashboard changes the contents
for users that are viewing it.
Performance and scalability are challenging issues in the design of any system, and especially so for
large, distributed process-based applications. This chapter provides configuration guidelines and
suggestions to build scalable process suite solutions and is divided into the following topics:
• General approach to performance, page 97
• System configuration guidelines, page 98
• Factors that affect performance and scalability, page 100
• Recommended environment settings, page 102
• Tuning and troubleshooting performance problems, page 105
In general, the recommended approach to performance testing is to carry out two classes of tests:
• Single-user profiling — In single-user profiling, you run unit tests, such as collecting DMCL/DFC
traces.
• Load testing — In load testing you are looking for bottlenecks and capacity issues.
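For the single-user profiling tests, DFC tracing is typically enabled through the dfc.properties file. The fragment below is a sketch based on DFC 6.x conventions; the property names and the trace directory are assumptions and should be verified against the DFC documentation for your version:

```properties
# Hypothetical dfc.properties fragment for single-user profiling.
# Verify property names against your DFC version's documentation.
dfc.tracing.enable=true
dfc.tracing.dir=C:/Documentum/dfc-traces
dfc.tracing.include_rpcs=true
```

Enable tracing only for the duration of the unit test; leaving it on adds I/O overhead of its own.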
Small configurations
The recommended approach to hosting small configuration environments is to use VMware. Create a
separate virtual machine for every tier. Each tier should have roughly 2 to 4 Central Processing Units
(CPUs) with 4 GB to 8 GB of Random Access Memory (RAM). A VMware configuration could include:
• TaskSpace application server
• Content server
• Database server
• BAM application server
• BAM server
• BAM database server
Medium configurations
The recommended approach for medium configuration environments is to use enterprise machines
that do not include clustering. Large servers ranging from 4 CPUs to 16 CPUs and 8 GB to 32 GB of
RAM are used for each of the following tiers:
• TaskSpace application server
• Content server
• Database server
• BAM application server
• BAM server
• BAM database server
• BOCS server at branch offices
If required, this type of configuration supports two content server instances on a single machine and
multiple content servers implemented on multiple machines.
Large configurations
Enterprise machines are used to host large configuration environments. Unlike medium
configurations, large configurations include hardware and software clustering at the application
server and database server levels. Large servers with machines ranging from 16 CPUs to 64 CPUs and
32 GB to 128 GB of RAM are used for each of the following tiers:
• TaskSpace application server
• Content server
• Database server
• BAM application server
• BAM server
• BAM database server
• BOCS server at branch offices
• The network must not experience performance issues and it must have minimal latency.
Documentum products are network sensitive and require a high speed network.
• All tiers of the server environment must be co-located in the same data center.
Consider system sizing during the design phase, since volumetric considerations influence the system
design. For example, some process transactions, like the Task List, are high yield, which means that
they are performed frequently. Such a transaction can perform well in a test environment but cause
performance problems when many users make simultaneous transactions. Each transaction issues
a database query that must return results in under one second. Aggregate demand on the database
adversely impacts user performance when the browser becomes unable to render screens in a timely
fashion; it can take five seconds or longer in some cases. If there are thousands of simultaneous users
querying the database, there may not be enough bandwidth to accommodate the demand, and
performance continues to degrade.
Search forms
A client may ask you to implement a more complex search option in TaskSpace. Even though it can
be easy to configure multiple search options, advise the client against doing so. The difference in
database performance (and maintenance) between three search options and four is enormous. The
best performing applications enable users to complete their jobs with a minimum of actions and
choices. Designing search forms that are seldom or never used is not advised.
Design search forms with as few search criteria and columns as possible. This keeps application
performance degradation to a minimum and makes any future scaling of the application easier.
Each search criterion requires more maintenance by the database and adds load to the system. For
instance, just one poorly constructed search form with many search options and columns can
render the application non-operational. Before designing search forms, it is important that you fully
understand the business use cases.
100 EMC Documentum Building TCM Solutions Version Best Practices Guide
Performance and Scalability
Wildcard searches
Avoid implementing wildcard searches because exact searches are the only searches that scale.
All other types of searches involve some form of scanning (index or table), which can hinder the
scalability of the database, and ultimately the entire application.
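The scan-versus-index distinction is easy to demonstrate in any relational database. The sketch below is not Documentum-specific; it uses SQLite's EXPLAIN QUERY PLAN to show that an exact match on an indexed column is satisfied by an index search, while a leading-wildcard LIKE degrades to a full scan:

```python
import sqlite3

# Toy illustration (SQLite, not Documentum): exact predicates can be
# satisfied by an index search, but a leading-wildcard LIKE must scan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE task (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE INDEX idx_task_name ON task (name)")

def plan(query):
    # EXPLAIN QUERY PLAN rows describe the strategy in their last column.
    rows = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(row[-1] for row in rows)

exact_plan = plan("SELECT * FROM task WHERE name = 'order-42'")
wildcard_plan = plan("SELECT * FROM task WHERE name LIKE '%order%'")

print(exact_plan)     # a SEARCH using the index
print(wildcard_plan)  # a SCAN: every row is examined
```

The same reasoning applies to the repository database behind a TaskSpace search form, whatever the vendor.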
It is important to separate what a client needs from what a client wants. The needs of the business
should determine how search forms are designed and used. System performance and future
scalability fall victim to over-engineered search forms. Articulate to the client the impact of
search technology on system performance. In most cases, clients are willing to reconsider a feature
that negatively impacts system performance.
Task lists
Task lists are one of the most frequently used views in TaskSpace. The best performing task lists limit
the number of SDTs and process variables that appear within the task list. Each additional SDT
and process variable results in a new query to the database. When several concurrent users are
working on the system, these queries add up. As a practical matter, you can consolidate multiple
process variables into a single SDT. For example, a task list containing nine process variables takes 10
seconds to 15 seconds to populate the window. The same task list built with one SDT containing
nine attributes takes only 3 seconds to 5 seconds. The difference is that the first task list issued
nine requests to the database while the second issued only one. (The best way to understand this
behavior is to take a single-click trace of the transaction.)
Consider the number of filters that you create in a task list because each filter impacts system
performance.
Pre-conditions
Be careful when using preconditions in a TaskSpace application because they inject row-by-row
processing into the response time. If there are ten tasks in a task list and there is a precondition
defined for the object type, then the precondition fires ten times. Although preconditions provide
rich functionality, they can negatively affect system performance. For example, a task queue with
preconditions took 8 seconds to 10 seconds to render. Without the preconditions, the same task
list rendered in 3 seconds.
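The row-by-row effect described above can be sketched abstractly. In this hypothetical model (none of these names are TaskSpace APIs), a list of ten tasks triggers ten precondition evaluations:

```python
# Hypothetical model of precondition evaluation; not a TaskSpace API.
def render_task_list(tasks, precondition=None):
    evaluations = 0
    visible = []
    for task in tasks:
        if precondition is not None:
            evaluations += 1          # the precondition fires once per row
            if not precondition(task):
                continue
        visible.append(task)
    return visible, evaluations

tasks = [{"id": i, "priority": i % 3} for i in range(10)]
visible, n_evals = render_task_list(tasks,
                                    precondition=lambda t: t["priority"] > 0)
print(n_evals)  # 10: one evaluation per task in the list
```

Whatever each evaluation costs, the total response time grows linearly with the number of rows displayed.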
It is recommended that you take a baseline timing of a task or search form before you add
preconditions. After you establish the baseline, the incremental performance cost of a feature can be
assessed. Associate a response time or resource cost with each feature request. Then it is possible to
calculate the cost and benefit of each feature. Calculate a cost even if a feature is mandatory. Discuss
feature costs, together with performance and scalability, with the client.
Skill-set matching
Skill-set matching is another feature that, if not used wisely, can lead to system performance issues.
There are two reasons for system performance issues.
First, skill-set matching involves row-by-row processing, where every task that returns in a task list is
evaluated against the user’s competency. This is a drain on system resources.
Second, skill-set matching does not use the task list Document Query Language (DQL) optimizations.
Without skill-set matching, a task list brings back only the first 100 to 300 tasks, so the generated data
list consumes fewer resources and performs faster. When the system matches user skills against
tasks, however, it requires the whole data list before it can return tasks to the user.
This means that all tasks are sent to the application server for evaluation against the skill-set. If a task
list contains 10,000 tasks, it brings back 10,000 tasks. Even if the user matches only one task out of
10,000, the system still brings back all 10,000, which takes time. A task list with 10,000 tasks that does
not match skill-sets brings back 100 to 300 tasks, depending on the number of SDTs involved. It is a
best practice to calculate a baseline timing of a task list before implementing skill-sets. This baseline
calculation enables you to determine how skill-set matching impacts performance. It is important
to realize that performance costs grow as the number of tasks in a task list grows. For example, a
task list that returns 100,000 tasks is unusable.
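The difference between fetching only a first page and materializing the whole list can be sketched with plain Python generators (a stand-in for the server behavior described above, not actual product code):

```python
from itertools import islice

# Stand-in for the server-side task query; not actual product code.
def fetch_tasks(n_total):
    for i in range(n_total):
        yield {"id": i, "skill": "billing" if i % 100 == 0 else "claims"}

# Without skill-set matching: stop after the first page of results.
first_page = list(islice(fetch_tasks(10_000), 300))

# With skill-set matching: every task is pulled back, then evaluated.
matched = [t for t in fetch_tasks(10_000) if t["skill"] == "billing"]

print(len(first_page))  # 300 rows fetched, then the query stops
print(len(matched))     # 100 matches, but all 10,000 rows were evaluated
```

The first query can stop early; the second must touch every row before anything reaches the user, which is why the cost grows with the size of the task list.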
Logins
The user login operation takes 2 seconds to 10 seconds to complete. The variable that most influences
the login time is the landing tab that opens after a user logs in. If the client prefers a fast login, set
the landing tab to the default page or to a blank search page. However, if the user has set a
preference for a different landing tab, logging in can take longer.
JVM Capacity
Heap size and thread count control the number of users the JVM can support. A JVM can accommodate
150 to 300 users per instance. Response times degrade as more users enter the JVM. The degradation
is due to users waiting for available threads. EMC recommends a 1024 MB heap. A heap size smaller
than 512 MB results in an out-of-memory error, as can a permanent generation that is not increased to
a minimum of 128 MB. The CPU is the first resource to bottleneck on this tier. The application server
is the most scalable tier in the system because there is no limit to the number of JVMs or servers
that can be added.
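As a sketch, the recommendations above map to JVM startup options like the following. The flag names are for the HotSpot JVMs contemporary with this guide (the permanent generation flags, for example, were removed in Java 8); verify them against your JVM version:

```shell
# Illustrative only; verify flags against your JVM version.
JAVA_OPTS="-Xms1024m -Xmx1024m -XX:PermSize=128m -XX:MaxPermSize=128m"
```

Setting the initial and maximum heap to the same value avoids resize pauses under load.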
Additional tips
If you are using the JBoss Application Server, turn off debugging. By default, debug mode is turned
on, which results in unnecessary I/O and CPU resource consumption. To turn off debugging, edit the
file C:\Documentum\jboss4.2.0\server\DctmServer_MethodServer\conf\jboss-log4j.xml so that the
FILE appender is configured as follows (note the INFO threshold):
<appender name="FILE" class="org.jboss.logging.appender.DailyRollingFileAppender">
   <errorHandler class="org.jboss.logging.util.OnlyOnceErrorHandler"/>
   <param name="File" value="${jboss.server.log.dir}/server.log"/>
   <param name="Threshold" value="INFO"/>
   <param name="Append" value="false"/>
   <layout class="org.apache.log4j.PatternLayout">
      <!-- The default pattern: Date Priority [Category] Message\n -->
      <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
   </layout>
</appender>
available sessions on the Content Server, which increases the Content Server CPU utilization. This is
the recommended trade-off since the Content Server is a scalable tier.
Chapter 8
Deploying the Application
After the application has been developed and tested, it is ready for the production environment. The
environment in which process-based applications are put into production generally differs from the
development environment. The production environment usually has more users and, therefore,
requires more hardware components, software modules, and databases. In this document, the term
deployment means the transfer of applications between environments: from development to test or
from test to production, with as many intermediate environments as necessary. In general, you
deploy applications from a source environment to a target environment.
The deployment process begins by preparing the target environment. To prepare the target
environment, EMC Documentum products such as Content Server, Process Engine, Process
Builder, and BAM that were installed in the source environment must also be installed in the target
environment. The next step is to define the users in the target environment. In most cases, the users
in the target environment are different from the users in the source environment.
To deploy:
1. Install TaskSpace and other Process Suite products in the target environment.
2. Create a Composer project. Add the SDTs that you used to create BAM reports to this project.
Build a DAR file. Install it on the target environment.
3. Import the TCMReferenceProject into your Composer workspace.
4. Start BAM in the production environment. The report entities corresponding to the SDTs you
added in Step 2 are populated in the Documentum repository.
5. If you created BAM custom entities, note that Composer does not migrate them. When you created
those entities, you ran one or more DQL scripts; save those scripts as a text file. At deployment
time, manually run the scripts in the target environment to create your custom entities in the
BAM database.
6. Create a Composer project. Name the project exactly the same as the name of the TaskSpace
application. Be sure to reference the TCMReferenceProject in the new project. (One of the steps in
the wizard prompts the user to select the reference project.)
Note: The TCMReferenceProject is a Composer Project that contains the artifacts that are needed
when you import a TaskSpace application, form template, or process template. At import time,
when a user wants to import a TaskSpace application into Composer, the TCMReferenceProject
must be in that user’s workspace. If a TaskSpace Application is imported into Composer, and
the TCMReferenceProject has not been specified as a reference project, the user receives the
following error:
Type name is invalid. Type names must not begin with ‘dm.’
For more information, see the Reference Projects section
of the Composer User Guide.
7. In Composer, select the option to import a TaskSpace application. Perform the import. This
action automatically pulls in all related artifacts:
• The associated process (from Process Builder), associated forms, roles, tabs
• Associated BAM dashboards (defined as tabs in TaskSpace)
• First-level BAM reports on these dashboards (First-level reports are not defined as drilldown
reports)
8. Manually import the drilldown reports. This action is required because the drilldown reports
you created in your dashboards are not automatically imported into the project. Also, import the
BAM Configuration artifact that contains various settings used by BAM (such as time settings for
the gap filler).
9. Build the DAR file for this project.
10. Install the DAR file into the target environment. Do this in one of two ways:
• Use the DAR installer.
• In Composer, right-click the project and select Install Documentum Project from the context
menu. (Alternatively, you can install using Ant scripts.)
Composer version
Use the latest version of Composer including any hot fixes that are available.
Database references
Whenever database references change as you move from the source to the target environment, these
references must be manually updated to the new database reference. Do this after you install your
application into the production environment.
Glossary
Activity templates
An activity template contains the logic to interact with a specific type of service (invoking the
service or receiving a message from it). For example, a Web Service activity template can be used
to invoke any web service. Activity templates are global and can be used in any number of
processes.
Child application
A TaskSpace application based on (created from) another TaskSpace application. An
application inherits all configuration settings from its parent application. New components,
tabs, or roles can be added or created for the new application. Changes made in the child
application have no effect in the parent application.
Correlation
In asynchronous interactions between applications, the caller application sends a request to
the service and the service responds after some delay. The caller application could have
sent many requests at the same time and can be waiting for responses to multiple requests.
Upon receiving a response, the caller application has to match it to the original request.
The process of identifying the request (business transaction) for a given response is
called correlation.
Correlation set
A correlation set is a collection of process variable attributes that can be used to uniquely
identify a business transaction. For example, in a purchase order process, one can uniquely
identify the purchase order transaction by the purchase order number. In the case of a loan
process, one can identify the business transaction started for processing a loan application by
the customer's social security number (SSN) and the property address. A process can have
zero or more correlation sets associated with it.
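As an illustrative sketch (hypothetical names, not a Process Engine API), a correlation set behaves like a lookup key built from process variable attributes:

```python
# Hypothetical sketch of correlation: match an incoming response to a
# waiting business transaction by its correlation-set key. Not a real API.
waiting_transactions = {
    ("123-45-6789", "12 Main St"): "loan-txn-001",  # key: (ssn, property address)
    ("987-65-4321", "9 Oak Ave"): "loan-txn-002",
}

def correlate(response):
    key = (response["ssn"], response["property_address"])
    return waiting_transactions.get(key)

txn = correlate({"ssn": "123-45-6789", "property_address": "12 Main St"})
print(txn)  # loan-txn-001
```

A response whose key matches no waiting transaction cannot be correlated and must be handled as an error.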
Configuration tab
A tab in TaskSpace that provides functionality for configuring components, processes, tabs,
roles, and other application settings.
Dashboard
An interface, created in Business Activity Monitor (BAM) or TaskSpace, with visual indicators
for process performance monitoring.
Dashboard component
A dashboard that has been added to a TaskSpace application for process performance
monitoring.
Dashboard tab
A tab in TaskSpace that displays one or more dashboard components for process performance
monitoring.
Package
A package is a named reference to an object in the Documentum repository.
Process
Defines the template for a composite application. It contains a set of connected activities that
are executed by the Process Engine.
Process variable
A process variable is a named reference to transient data used in a process. Process variables
can be of a simple type (string, integer, Boolean, float, or date) or can be structured data
types. The lifecycle of process variables is managed by the Process Engine. For example,
when a process instance or business transaction is created, the Process Engine creates a copy
of the object (simple or SDT) for use within the business transaction, and these objects are
deleted when the business transaction finishes. Process designers can optionally specify the
permission set that should be associated with these process variables. This is useful when
designers want to restrict access (read/write) to these objects for a selected set of users.
Manual activity
Manual activities are managed by a human workflow service. The Process Engine creates a
task that is assigned to a specific user, group, or work queue. The process waits and
moves to the next activity only after the task (corresponding to the manual activity) is finished.
Outbound activity
Outbound activities invoke a service. An outbound activity (created by using one of the
activity templates) is configured by specifying values for service-specific configuration
parameters (for a web service, these are the WSDL URL, port type, and operation), mapping
rules for creating a service-specific message (a SOAP message in the case of a web service)
from the process data model, and rules for processing the response. The Process Engine
executes the outbound activity and moves to the next activity only after successful invocation
of the service.
Index
application, 107
  Composer version, 108
  steps, 107
deployment, 107
  Composer version, 108
  deleting SDTs, 109
  migrating, 60
  overwriting existing process, 110
  planning, 26
  process templates, 109
  setting up, 18
  updating database references, 109
  upgrading, 60
  versioning process template, 110
DFC trace, 105
document view, 71
documents
  displaying inline, 71
E
EMC products, 13
environment
  application server settings, 102
  content server settings, 103
  database server settings, 104
environments
  purging, 92
error messages, 74
F
folders
  displaying inline, 72
form templates
  creating, 63
forms
  electronic forms, 70
  performance, 84, 100
G
grants management application, 12
I
implementing
  TaskSpace, 85
initiate process template, 69
J
JVM capacity, 103
L
log messages, 61
logins
  performance, 102
looping
  count children activity, 44
  defining, 41
  simple loop, 42
  wait activity, 44
M
message correlation, 35
migrating, 60
monitoring
  SDTs, 57
O
object types
  custom object types, 41
P
package attributes
  monitoring, 30
  using, 29
packages, 27
  description, 28
  displaying multiple inline, 72
  electronic forms, 70
  enabling reporting, 57
performance
  analyzing, 105
  automated workflow activities, 59
  configuration recommendations, 99
  form templates, 84
  impactors to, 100
  large configurations, 99
  logins, 102
  medium configurations, 99
  pre-conditions, 101
  report performance, 94
  SDTs, 59
  search criteria, 100
  search forms, 100
  skill set matching, 102
using, 85
task view
  designing, 71
tasks
  sending to temporary group, 32
TaskSpace
  adding buttons to a task form, 85
  assigning current object actions to menus, 87
  current object actions, 86
  Finish button failing on a task form, 86
  implementing, 85
  iterative rollout of an application, 85
  metadata pane in a folder view, 86
  pre-conditions, 101
  selected object actions, 87
  task lists, 101
tomcat server
  settings, 102
U
upgrading, 60
user interface
  creating, 63
  design tips, 63
  form templates, 63
  search templates, 64
V
versioning
  process template, 110
  processes, 60
W
web service
  invoking a secure service, 32
Workflow
  automated activities, 59