New Features and Enhancements
This section describes new features and enhancements in PowerCenter 8.1.1:
♦ Code Pages
♦ Command Line Programs
♦ Data Analyzer (PowerAnalyzer)
♦ Data Profiling
♦ Data Stencil (Visio Client for PowerCenter)
♦ Domains
♦ Integration Service
♦ Metadata Manager (SuperGlue)
♦ Repository and Repository Service
♦ Security
♦ Transformations and Caches
♦ Unstructured Data
♦ Web Services Provider
♦ Workflow Monitor
♦ XML
♦ PowerCenter Connects
♦ PowerExchange Client for PowerCenter
You can find information about newly supported platforms and databases on
my.informatica.com.
Code Pages
PowerCenter 8.1.1 supports the following code pages:
♦ IBM833
♦ IBM834
♦ IBM850
♦ IBM933
infasetup Changes
The infasetup command line program contains a DeleteDomain command.
pmrep Changes
The pmrep command line program contains new commands to manage connections and
versions: GetConnectionDetails, ListConnections, PurgeVersion, and DeleteObject.
Changed commands include ObjectExport, ObjectImport, UpdateSeqGenValues,
ListAllUsers, and TruncateLog.
Data Profiling
♦ Partitioning. You can create multiple partitions for profile sessions containing source-level
and column-level functions.
Domains
This section describes changes to domains.
Alerts
♦ Subscribe to alerts. You can subscribe to alerts to receive email notification about node
events such as node failure or a master gateway election. When you subscribe to alerts, you
receive notifications for all domains and services on which you have permission.
Users
♦ Set permissions by user. You can view and edit permissions and privileges for each user on
the Administration tab. You can assign multiple permissions to a user when you add a user
account or edit the permissions.
♦ Status added to each log event. The User Activity Monitoring report shows the status of
each log event. The report indicates whether each log event succeeded or failed.
High Availability
♦ Resilience for sources and Lookup transformations. The Integration Service is resilient to
temporary network failures or relational database unavailability when the Integration
Service retrieves data from a database for sources and Lookup transformations. If the
Integration Service loses the connection after making the initial connection, it attempts to
reconnect for the amount of time configured for the retry period in the connection object.
♦ Operating mode. You can run the Integration Service in safe mode to limit access to the
Integration Service during deployment or maintenance. You can enable the Integration
Service in safe mode and you can configure the Integration Service to run in safe mode on
failover.
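The retry behavior described for resilient sources and Lookup transformations can be sketched as follows. This is an illustrative model only, not a PowerCenter API: the function and parameter names are assumptions, and `connect` stands in for whatever call opens the relational connection.

```python
import time

def connect_with_retry(connect, retry_period_seconds, poll_interval=1.0):
    """Attempt to (re)connect until the configured retry period elapses.

    `connect` is any callable that returns a connection object or raises
    ConnectionError on failure -- a stand-in for the relational connection
    a source or Lookup transformation uses.
    """
    deadline = time.monotonic() + retry_period_seconds
    last_error = None
    while time.monotonic() < deadline:
        try:
            return connect()
        except ConnectionError as exc:
            last_error = exc          # transient failure: keep retrying
            time.sleep(poll_interval)
    # Retry period exhausted: surface the failure, as the session would fail.
    raise TimeoutError(f"could not reconnect within retry period: {last_error}")
```

The key point the sketch captures is that reconnection attempts stop once the retry period configured in the connection object is exhausted.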
Load Balancer
You can configure the way the Load Balancer dispatches tasks by defining the following
properties:
♦ Resource provision thresholds. You can configure resource provision thresholds for each
node in a grid to cause the Load Balancer to consider or eliminate a node for the dispatch
of a task. You can define the Maximum CPU Run Queue Length and Maximum
Memory % thresholds in addition to Maximum Processes.
♦ Dispatch mode. You can configure the way the Load Balancer dispatches tasks by setting
the dispatch mode for the domain. You can choose round-robin, metric-based, or adaptive
dispatch mode.
♦ Service levels. The Load Balancer uses service levels to determine the order in which to
dispatch tasks from the dispatch queue. You create service levels in the Administration
Console. When you configure a workflow, you assign a service level to the workflow.
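How service levels order the dispatch queue can be sketched with a priority queue. This is a conceptual model, not the Load Balancer implementation; the class and task names are hypothetical.

```python
import heapq
import itertools

class DispatchQueue:
    """Order tasks by service level priority (lower number dispatches first),
    falling back to arrival order for tasks at the same service level."""

    def __init__(self):
        self._heap = []
        self._arrival = itertools.count()  # tie-breaker: arrival order

    def submit(self, task_name, service_level_priority):
        heapq.heappush(
            self._heap, (service_level_priority, next(self._arrival), task_name)
        )

    def dispatch(self):
        _, _, task_name = heapq.heappop(self._heap)
        return task_name
```

A task submitted with a higher-priority service level is dispatched ahead of earlier arrivals at a lower-priority level, which is the ordering behavior the bullet above describes.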
Operating Mode
♦ Normal and safe operating mode. You can run the Integration Service in normal or safe
operating mode. Run the Integration Service in safe operating mode to limit access to the
Integration Service during migration and maintenance activities. You configure the
operating mode in the Administration Console.
Section Scope
The heading of each parameter file section determines the scope of the parameters and
variables defined in it:
♦ [Global]. Parameters and variables defined in this section apply to all Integration
Services and Integration Service processes in the domain, plus all workflows, worklets,
and sessions.
♦ [Service:Integration_Service_name]. Parameters and variables defined in this section
apply to the named Integration Service, plus all workflows, worklets, and sessions the
service runs.
♦ [Service:Integration_Service_name.ND:node_name]. Parameters and variables defined
in this section apply to the Integration Service that runs on the named node (the
Integration Service process), plus all workflows, worklets, and sessions the service
process runs.
The parameters and variables you define in these sections apply when you run a workflow
or session that uses the parameter file.
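A parameter file using these section headings might look like the following. The service name, node name, and the user-defined parameter $$SourceSystem are hypothetical; $PMRootDir and $PMSessionLogDir are shown only as examples of variables a section could set.

```ini
[Global]
$PMRootDir=/data/infa

[Service:IS_Production]
$$SourceSystem=ORDERS

[Service:IS_Production.ND:node01]
$PMSessionLogDir=/data/infa/node01/logs
```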
You can use the following variable types in email addresses, subjects, and bodies:
♦ Email task (address, subject, and body). Service variables, service process variables,
and workflow variables.
♦ Suspension email (address, subject, and body). Service variables, service process
variables, and workflow variables.
♦ Workflow variables in sessions. You can use workflow variables in sessions. When you
do, the Integration Service treats them as parameters, so their values do not change
while the session runs.
Performance
♦ Pushdown optimization. You can push transformation logic to the source or target
database when the Lookup transformation contains a lookup override. To perform source-
side, target-side, or full pushdown optimization for a session containing lookup overrides,
configure the session for pushdown optimization and select a pushdown option that allows
you to create views. You can also use full pushdown optimization when you use the target
loading option to treat source rows as delete.
♦ Sorter transformation. The Sorter transformation uses a more compact storage format,
which significantly reduces the temporary disk space required to sort a large data set
that does not fit in the in-memory sort cache. The sort algorithm has also been
improved to significantly reduce sort time.
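The idea behind pushdown optimization with a lookup override can be illustrated with a small SQL example. This uses SQLite purely for illustration, not PowerCenter itself: the table, view, and column names are hypothetical, and the view stands in for the view PowerCenter creates when a pushdown option that allows view creation is selected.

```python
import sqlite3

# Build a small source table and a lookup table in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER);
    CREATE TABLE customers (cust_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10), (2, 20);
    INSERT INTO customers VALUES (10, 'EAST'), (20, 'WEST');
""")

# A lookup override can be expressed as a view, so the lookup is evaluated
# as a join inside the database (pushdown) instead of row by row in the
# Integration Service.
conn.execute("""
    CREATE VIEW orders_with_region AS
    SELECT o.order_id, c.region
    FROM orders o JOIN customers c ON o.cust_id = c.cust_id
""")
rows = conn.execute(
    "SELECT order_id, region FROM orders_with_region ORDER BY order_id"
).fetchall()
```

The join runs entirely in the database engine, which is the performance benefit pushdown optimization targets.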
Targets
♦ Generate flat file targets by transaction. You can generate a separate flat file target each
time the Integration Service starts a new transaction. You can dynamically name each file.
To generate a separate output file for each transaction, you add a FileName port to the flat
file target definition. When you connect the FileName port in the mapping, the
Integration Service writes a target file at each commit. The Integration Service uses the
FileName port value to name the output file. You can generate output files from source-
based or user-defined commits.
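The FileName-port behavior can be sketched as follows. This is a conceptual model, not PowerCenter code: the function name is hypothetical, and io.StringIO stands in for flat files written to disk at each commit.

```python
import io

def write_transactional_targets(transactions):
    """Write each transaction to its own named target, mimicking a FileName
    port: the FileName value of each transaction names its output file.

    `transactions` is a list of (file_name, rows) pairs, one pair per
    commit. Returns a dict mapping file name -> file contents.
    """
    outputs = {}
    for file_name, rows in transactions:        # one entry per commit
        target = io.StringIO()                  # open a new target file
        for row in rows:
            target.write(",".join(map(str, row)) + "\n")
        outputs[file_name] = target.getvalue()  # commit closes the file
    return outputs
```

Each commit produces a separate, independently named output, mirroring how the Integration Service writes a target file at each commit using the FileName port value.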
Administration
♦ Authentication. You can use LDAP for Metadata Manager on JBoss Application Server,
WebLogic Application Server, or WebSphere Application Server.
♦ Clustering. You can use clusters for the Metadata Manager application on JBoss
Application Server, WebLogic Application Server, or WebSphere Application Server. You
use clusters to distribute the Metadata Manager application load across multiple machines.
Clustering may increase performance when the application has a large number of
concurrent users.
♦ Secure Sockets Layer (SSL). You can configure Metadata Manager to use the SSL protocol
to provide a secure, authenticated connection between the Metadata Manager server and a
web browser client.
Installation
♦ Sybase ASE. You can install the Metadata Manager Warehouse and PowerCenter
repository for Metadata Manager on Sybase ASE.
Utilities
♦ EAR Repackager utility. The EAR Repackager utility extracts files from the Metadata
Manager EAR file and repackages customized files back into it. You may need to
customize files in the EAR file to change the Metadata Manager images, color schemes,
style sheets, or configuration files.
♦ License Update utility. The License Update utility updates the Metadata Manager license.
You may need to update an existing license with a new license key to extend the expiration
date.
XConnects
♦ Business Objects XConnect. The Business Objects XConnect extracts metadata from
Business Objects XI.
♦ PowerCenter XConnect. You can configure the PowerCenter XConnect to capture
information about flat file sources, lookups, and targets that are specified in parameter
files. You can view the flat files in data lineage and where-used analysis.
Security
♦ IBM DB2 client authentication. You can use IBM DB2 client authentication to
authenticate database users. IBM DB2 client authentication lets you log in to an IBM
DB2 database without specifying a database user name or password if the IBM DB2 server
is configured for external authentication or if the IBM DB2 server is on the same machine
hosting the Integration Service process. PowerCenter uses IBM DB2 client authentication
when the connection user name is PmNullUser and the connection is for an IBM DB2
database.
Unstructured Data
♦ Unstructured Data transformation. You can run the Unstructured Data transformation
on a 64-bit Integration Service on HP-UX (Itanium) and Solaris.
Web Services Provider
♦ Web Services Hub URL. The Web Services Hub URL has the following format:
http://<hostname>:<portnum>/wsh
Workflow Monitor
♦ View binary log files. You can view existing binary log files for sessions and workflows in
the Log Events window.
XML
♦ XML foreign key column names. You can configure the Designer to prefix generated
foreign key columns with the name of the XML view and the name of the primary key
column.