
PACK4SAPR3.book Page i Thursday, January 15, 2004 11:31 AM

Ascential
PACK for SAP R/3

Part No. 00D-026SAP502


Version 5.0.2
June 2004

Copyright © 2001–2004 Ascential Software Corporation. All rights reserved.

Ascential, DataStage, and MetaStage are trademarks of Ascential Software Corporation or its affiliates and may be
registered in the United States or in other jurisdictions.

ABAP, BW, R/3, and SAP are registered trademarks of SAP AG.

Microsoft, Windows, and Windows NT are registered trademarks of Microsoft Corporation in the United States and
other countries.

UNIX is a registered trademark in the United States and other countries, licensed exclusively through X/Open
Company, Ltd.

Other marks are the property of the owners of those marks.

This product may contain or utilize third party components subject to the DataStage user documentation previously
provided by Ascential Software Corporation or contained herein.

PACK for SAP R/3


Table of Contents

Preface
Organization of This Manual .......................................................................................... vii
Documentation Conventions .......................................................................................... viii
DataStage Documentation ............................................................................................. viii

Chapter 1. Installation
Platforms ........................................................................................................................ 1-1
Software Requirements .................................................................................................. 1-1
Installing the Server Component on Windows .............................................................. 1-3
Installing the Server Component on UNIX ................................................................... 1-4
Installing the Client Component .................................................................... 1-7
Terminology ................................................................................................... 1-8

Chapter 2. The ABAP Plug-In


Functionality .................................................................................................................. 2-2
Installation Prerequisites ................................................................................................ 2-3
Integrating DataStage with SAP R/3 Systems ............................................................... 2-4
Importing the RFC and Authorization Profile ........................................................ 2-4
Defining Stage Properties ....................................................................................... 2-7
Defining Character Set Maps .................................................................................. 2-8
Defining Output Properties ..................................................................................... 2-8
The Output General Page ............................................................................................... 2-9
Defining SAP Connection and Logon Details ...................................................... 2-11
Defining Connection Properties ........................................................................... 2-12
Selecting the Data Transfer Method ............................................................................ 2-13
The Output ABAP Program Page ................................................................................ 2-18
Building SQL Queries ................................................................................................. 2-22
Build Open SQL Query Tables Page .................................................................... 2-23


Build Open SQL Query Select Page .....................................................................2-26


Build Open SQL Query Where Page ....................................................................2-28
Build Open SQL Query Having Page ...................................................................2-29
Build Open SQL Query Order By Page ................................................................2-29
Build Open SQL Query SQL Page .......................................................................2-30
Building Extraction Objects .........................................................................................2-31
Connecting to the Server .......................................................................................2-31
Searching for Tables ..............................................................................................2-32
Specifying the SQL Condition for Extraction Objects .........................................2-41
Using Job Parameters ............................................................................................2-43
The Program Generation Options Dialog .....................................................................2-45
The Output Columns Page ...........................................................................................2-46
Completing the Job .......................................................................................................2-46
SAP R/3 Table Type Support .......................................................................................2-46
SAP R/3 Data Type Support .........................................................................................2-47
Selection Criteria Data Types ................................................................................2-47
SAP R/3 to DataStage Data Type Conversion ......................................................2-49
Validating Run-time Jobs ......................................................................................2-49

Chapter 3. The IDoc Extract Plug-In


Introduction ....................................................................................................................3-1
Functionality ...................................................................................................................3-2
Using DataStage to Process SAP IDocs .........................................................................3-2
Runtime Components .....................................................................................................3-5
Listener Sub-System ......................................................................................................3-5
Listener Manager ....................................................................................................3-6
Listener Server ........................................................................................................3-6
Job Execution Mode ................................................................................................3-8
Persistent Staging Area ...........................................................................................3-8
Configuration Files ............................................................................................... 3-11
IDoc Extract for SAP R/3 Plug-in Stage ......................................................................3-15
Bookmarking the PSA ..........................................................................................3-19
Creating DataStage Jobs for IDoc Types .....................................................................3-19
Defining the DataStage Connection to SAP .................................................................3-20
About the Stage General Page ...............................................................................3-21


Selecting IDoc Types ................................................................................................... 3-23


Defining IDoc Type Properties ............................................................................. 3-26
Specifying R/3 Versions ....................................................................................... 3-29
Defining IDoc Extract Output Links ........................................................................... 3-31
About the General Tab for the Output Page ......................................................... 3-32
Adding Columns ................................................................................................... 3-33
Saving IDoc Meta Data Definitions ..................................................................... 3-34

Chapter 4. The IDoc Load Plug-In


Introduction .................................................................................................................... 4-1
Configuration Requirements ................................................................................... 4-3
IDoc Load for SAP R/3 Stage ........................................................................................ 4-3
Selecting the DataStage Connection to SAP ................................................................. 4-4
About the Stage General Page ................................................................................ 4-6
Selecting IDoc Types ................................................................................................... 4-10
About the Stage IDoc Type Page .......................................................................... 4-12
Defining Character Set Mapping ................................................................................. 4-13
Defining IDoc Load Input Links ................................................................................. 4-13
Modifying Columns .............................................................................................. 4-16
Synchronizing Columns ....................................................................................... 4-20

Chapter 5. The BAPI Plug-In


Introduction .................................................................................................................... 5-1
Functionality .................................................................................................................. 5-2
Building a Job ................................................................................................................ 5-3
Defining Stage Properties .............................................................................................. 5-4
Defining Character Set Maps .................................................................................. 5-5
Defining SAP Connection and Logon Details ........................................................ 5-6
Selecting the DataStage Connection to SAP .......................................................... 5-7
Defining Connection Properties ............................................................................. 5-8
Defining Input Properties ............................................................................................... 5-9
Input General Page ............................................................................................... 5-10
Filtering Business Objects .................................................................................... 5-10
BAPI Explorer Dialog .......................................................................................... 5-11
Defining BAPI Interfaces ..................................................................................... 5-13


Defining Log Files ................................................................................................5-16


Input Columns Page ..............................................................................................5-18
Defining Output Properties ..........................................................................................5-19
Output General Page .............................................................................................5-19
Output BAPI Page .................................................................................................5-20
Output Read Logs Page .........................................................................................5-23
Output Columns Page ...........................................................................................5-25
Completing the Job .......................................................................................................5-26
Run-Time Component ..................................................................................................5-26

Chapter 6. The Administrator for SAP Utility


About the DataStage Connections to R/3 Page .......................................................6-2
About the IDoc Cleanup and Archiving Page .......................................................6-15

Appendix A. SAP Authorization Requirements for ABAP


SAP Authorization Requirements ................................................................................. A-1
Creating an Authorization Profile Manually .......................................................... A-2
Installing the RFC Function Code Manually ................................................................ A-5
Configuring the SAP/Dispatch/Gateway Service ................................................ A-12
Troubleshooting for RFC Installation for SAP R/3 ............................................. A-17

Appendix B. Packaged Extractions for ABAP


Appendix C. Properties for the IDoc Extract Plug-In
Appendix D. File Permissions for IDoc Extract Plug-In
Configuring Connections and IDoc Types .................................................................... D-1
Running DataStage jobs ......................................................................................... D-3
Unconfigured IDoc types ....................................................................................... D-4
Configuring IDoc Archival and Cleanup ............................................................... D-4
Recommendations .................................................................................................. D-4

Index


Preface

This manual describes the use of Version 5.0 of the Ascential PACK for SAP R/3.
If you are new to DataStage, read the DataStage Designer Guide and the DataStage
Manager Guide. These describe the DataStage Designer and Manager and help you
get started.

Organization of This Manual


This manual contains the following:
Chapter 1 describes the installation requirements for version 5 of the Ascential
PACK for SAP R/3 for DataStage 6.0 or later. It also describes the installation
and upgrade procedures for the server and the client components on Windows
and UNIX platforms. It introduces the terminology used in this manual.
Chapter 2 introduces you to Version 5.0 of the Ascential PACK for SAP R/3
and the ABAP Extract plug-in. It describes the functionality for this plug-in
and describes how to create a job using the ABAP Extract plug-in.
Chapter 3 introduces you to the IDoc Extract plug-in. It describes the function-
ality for this plug-in and how to create a job using the IDoc Extract plug-in.
Chapter 4 introduces you to the IDoc Load plug-in. It describes the function-
ality for this plug-in and how to create a job using the IDoc Load plug-in.
Chapter 5 introduces you to the BAPI plug-in. It describes the functionality for
this plug-in and how to create a job using the BAPI plug-in.
Chapter 6 describes the DataStage Administrator for SAP utility. It discusses
the DataStage Connections to BW, the DataStage Connections to R/3, and the
IDoc Cleanup and Archiving pages.
Appendix A describes the SAP Authorization Requirements for ABAP.
Appendix B discusses the packaged extractions for ABAP.
Appendix C describes the properties for the IDoc Extract plug-in.
Appendix D discusses the file permissions for the IDoc Extract plug-in.


Documentation Conventions
This manual uses the following conventions:

Convention Usage
Bold In syntax, bold indicates commands, function names, and
options. In text, bold indicates keys to press, function names,
and menu selections.
UPPERCASE In syntax, uppercase indicates commands, keywords, and
options; statements and functions; and SQL statements and
keywords.
Italic In syntax, italic indicates information that you supply. In text,
italic also indicates UNIX commands and options, filenames,
and pathnames.
Courier Courier indicates examples of source code and system
output.
Courier Bold In examples, courier bold indicates characters that you type
or keys you press (for example, <Return>).
[ ] Brackets enclose optional items. Do not type the brackets
unless indicated.
{} Braces enclose nonoptional items from which you must
select at least one. Do not type the braces.
itemA | itemB A vertical bar separating items indicates that you can choose
only one item. Do not type the vertical bar.
... Three periods indicate that more of the same type of item can
optionally follow.
➤ A right arrow between menu options indicates you should
choose each option in sequence. For example, “Choose
File ➤ Exit” means you should choose File from the menu
bar, then choose Exit from the File menu.

DataStage Documentation
DataStage core documentation is available online in PDF format. You can read
these manuals with the Adobe Acrobat Reader supplied with DataStage. See the
Ascential installation documentation for details on installing the manuals and
the Adobe Acrobat Reader.
Online help is also supplied for DataStage and the Ascential PACK for SAP R/3.


1
Installation

Install server and client components for Version 5 of the Ascential PACK for SAP
R/3 on the DataStage server and client systems respectively. This PACK includes
the following plug-ins and utility:
• ABAP Extract. Lets DataStage extract data from the R/3 Repository using
the ABAP extraction program generated by the plug-in.
• IDoc Extract. Lets DataStage capture IDocs from R/3 source systems to be
used as source data for DataStage job data streams.
• IDoc Load. Generates IDocs to load data into SAP R/3.
• BAPI. Loads data into and extracts data from SAP R/3 Enterprise.
• Administrator for SAP. Manages the configurations of R/3 connection and
IDoc type objects.

Platforms
Because the plug-ins for the Ascential PACK for SAP R/3 are packaged together,
you install all plug-ins (you cannot optionally install an individual plug-in). The
product is distributed in one of the following ways, depending on your platform:
• Windows. One CD-ROM.
• UNIX. Two CD-ROMs. The server components are distributed on a sepa-
rate CD-ROM from the client component. Please ensure that you have also
received the Windows CD-ROM to install the DataStage client components.


Software Requirements
For information about configuration requirements for DataStage and the latest
information about DataStage, see the instructions supplied with your DataStage
installation CD and the online readme.txt file for your platform. For other installa-
tion prerequisites, see the respective sections in Chapter 2 for the ABAP Extract
plug-in, Chapter 3 for IDoc Extract, Chapter 4 for IDoc Load, and Chapter 5 for
BAPI.
The following are required for the Ascential PACK for SAP R/3, depending on
your platform:
• Windows. Use one of the following:
Windows NT 4.0 with Service Pack 6A or later
Windows 2000 Professional/Server/Advanced Server with Service Pack 2 or later
Windows XP Professional
• UNIX. Use one of the following:
Sun Solaris 2.7, 2.8
IBM AIX 4.3.3, 5.1
HP HP-UX 11.0, 11i
Linux RedHat Linux 7.3
HP/Compaq Tru64 (on request)
• SAP R/3 4.0B, 4.5B, 4.6C, 4.7 or later
• SAP RFC library:
DataStage client and server (Windows). SAP RFC client library librfc32.dll
6.10 or later.
DataStage server (UNIX). Thread-safe, shared SAP client library, 6.10 or
later. The name varies depending on your platform, for example:

If your platform is… Use this library…


Solaris librfccm.so
HP-UX librfccm.sl
AIX librfccm.o
Linux librfccm.so
Tru64 librfccm.so

Note: The SAP RFC librfc32.dll library is no longer included with Windows
client installations. You must ensure that this library is installed in
the Windows system directory before you install the DataStage
components.

If you are installing the client and server components of the plug-ins
on Windows, librfc32.dll must exist in the Windows system directory.
If you install the client on Windows and the server on UNIX,
librfc32.dll must still exist on the Windows system. Additionally, you
must ensure that the library (the name depends on the specific UNIX
platform) exists in the <dshome>/lib directory, where <dshome>
corresponds to the DataStage home directory. (We recommend that
the library file reside in your user environment.)
To find and change to your DataStage home directory, enter the
following at the UNIX prompt:
# cd `cat /.dshome`
pwd displays the working directory, for example:
# pwd
/u1/dsadm/Ascential/DataStage/DSEngine
# cd lib
# pwd
/u1/dsadm/Ascential/DataStage/DSEngine/lib
If the SAPGUI front-end has been installed on your Windows NT
system, this library may already be installed. However, you should
check it to ensure that it is the correct version.
If the front-end is not installed, see your SAP Administrator to
obtain this library from SAP. See Note 19466 on the SAP Service
Marketplace web site for details about obtaining downloads. Addi-
tionally, see “Configuring the SAP/Dispatch/Gateway Service” on
page A-12 for additional configuration requirements.
• DataStage 6.0 or later on the DataStage client and server machines
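As a quick sanity check before installing the UNIX server component, a script along these lines can confirm that a thread-safe RFC library is present. This is only a sketch, not part of the product; the helper name check_rfc_lib is illustrative, and the candidate file names simply mirror the platform table above.

```shell
# check_rfc_lib: look in the given directory (typically <dshome>/lib)
# for a platform-specific SAP RFC client library: librfccm.so on
# Solaris/Linux/Tru64, librfccm.sl on HP-UX, librfccm.o on AIX.
check_rfc_lib() {
  libdir="$1"
  for name in librfccm.so librfccm.sl librfccm.o; do
    if [ -f "$libdir/$name" ]; then
      echo "found $libdir/$name"
      return 0
    fi
  done
  echo "no SAP RFC library found in $libdir" >&2
  return 1
}
```

For example, check_rfc_lib "`cat /.dshome`/lib" reports whether the library was located in the DataStage home lib directory.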
Install the server component from the CD-ROM as described in the following
sections for Windows NT and UNIX. To install the client component, see
“Installing the Client Component” on page 1-7.

Note: Before installing the Ascential PACK for SAP R/3, you should shut down
all SAP-related software, specifically: SAP Frontend, SAP Internet
Graphics Server service (if present). Otherwise, errors can occur during
the install.


Installing the Server Component on Windows


You need Windows Administrator privileges to install the RFC Listener Server
and Server Manager components on Windows platforms.
To install these components:
1. Load the CD-ROM containing the Ascential PACK for SAP R/3 server
components.
2. Run setup.exe from the server directory on the CD-ROM.
3. Accept the default settings for all the screens. Click Next for each panel.
4. If the default settings are accepted, the setup installs the plug-ins into all
projects by automatically launching the DataStage Package Installer. If the
installation appears to hang, check the task bar for the Package Installer
window and activate it. Respond to any prompts. Normally the Package
Installer operates in silent mode, but it may require user input under some
circumstances.

To Install Plug-ins at a Later Time on Windows


Normally, you install these plug-ins using the instructions described in the
previous section. However, if you reinstall these plug-ins at any time, simply
invoke the DataStage Package Installer.
You need Windows Administrator privileges to install these plug-ins on
Windows. To install these components:
1. Mount the CD-ROM containing the Ascential PACK for SAP R/3.
2. Choose Start ➤ Programs ➤ Ascential DataStage ➤ DataStage Package
Installer. The DataStage Package Installer wizard appears.
3. In the Package Directory field, enter a pathname or click Browse to search for
the runtime component, for example, F:\ABAP_EXT_PACK.
4. Continue with the installation. Accept the default values by clicking Next for
each panel.
5. Click Finish when you reach the last panel.

Installing the Server Component on UNIX


This section describes the installation of the Ascential PACK for SAP R/3 on
UNIX systems. These instructions apply to the DataStage ABAP Extract for SAP
R/3, the IDoc Extract for SAP R/3, the IDoc Load for SAP R/3, and the BAPI for
SAP R/3 plug-ins, described in Chapters 2 through 5 of this manual.
1. Log in to the DataStage server host system as root or dsadm.
2. Mount the CD-ROM containing the server components.
3. Change directories to the mounted CD-ROM.
4. Run the installation script. Depending on your platform, the format of
files on the CD-ROM may differ, for example:

If your platform is… Enter…


Solaris ./install.sh
AIX ./install.sh
Linux ./install.sh
HP-UX ./INSTALL.SH\;1
Tru64 ./INSTALL.SH\;1

The installation script installs the RFC listener sub-system first. If an entry in
the dsenv file for DSSAPHOME does not exist, you are prompted to add it, for
example:
DSSAPHOME=/u1/dsadm/Ascential/DataStage;export DSSAPHOME
For more information about the dsenv file, see DataStage Install and Upgrade
Guide.
5. Install the plug-ins (they are all automatically installed and all projects
are updated).
See DataStage Manager documentation for information about registering
plug-ins for new projects.
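The DSSAPHOME check in step 4 can also be performed by hand before running the installation script. The following sketch appends the entry to a dsenv file only when it is missing, so running it twice is harmless; the helper name add_dssaphome is illustrative, and the path shown is the example value from above.

```shell
# add_dssaphome: append a DSSAPHOME entry to the named dsenv file
# unless one is already present (idempotent).
add_dssaphome() {
  dsenv_file="$1"
  if ! grep -q '^DSSAPHOME=' "$dsenv_file"; then
    echo 'DSSAPHOME=/u1/dsadm/Ascential/DataStage;export DSSAPHOME' >> "$dsenv_file"
  fi
}

# Typically run against the dsenv in the DataStage home directory:
# add_dssaphome "`cat /.dshome`/dsenv"
```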

To Install Plug-ins at a Later Time on UNIX


1. As root, change directories to the bin subdirectory of the DataStage home
directory. The location of this directory is stored in /.dshome. For example,
enter cd `cat /.dshome`/bin to go to the DataStage home bin directory.
2. Run the dspackinst command. When prompted to enter a package directory,
enter the name of the directory where the plug-ins reside, typically on the CD.


Depending on your platform, the format of files on the CD may differ, for
example, on AIX, the plug-ins are in /cdrom/packages.
Old DataStage jobs using the plug-ins are upgraded for compatibility with
the new version of the plug-ins.

Note: Do not use dspackinst to perform the initial installation of the plug-ins.

If you must install the ABAP Extract plug-in, you cannot use dspackinst. The plug-
in must be installed by registering it with the DataStage Manager.
To register the ABAP Extract plug-in:
1. From the DataStage client, start the DataStage Manager and log in to the
desired project.
2. Select Tools ➤ Register Plug-In… from the menu bar.
3. In the dialog, browse or enter the path of the plug-in, for example,
dsr3enu.so (or dsr3jpn.so for a Japanese installation).
4. Enter Server\PACKS in the Category field.
5. Select the Parallel stage type required check box.
6. Click OK.

7. Change the Name suffix from PX to _PX.


8. Click OK.

Installing the Client Component


You must be a Windows Administrator to install the client component. To take
advantage of the plug-in GUI features, you must install it on each DataStage client
system as follows:
1. In the client folder on the CD, double-click setup.exe.
2. Optionally create a shortcut if you are prompted to create a desktop
shortcut for the additional client DataStage Administrator for SAP utility.
You can access this utility using the shortcut or the Ascential DataStage
program group.

Note: You must use a username and a non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.

Call Ascential Technical Support at 866.INFONOW (866.463.6669) if you have
questions.


Terminology
The following table describes the terms used in the Ascential PACK for SAP R/3
to describe all plug-ins:

Term Description
ABAP Advanced Business Application Programming. The
language developed by SAP for application development
purposes. All R/3 applications are written in ABAP.
BAPI Business Application Programming Interface. A precisely
defined interface providing access to processes and data
in business application systems. BAPIs are defined as
API methods of SAP objects. These objects and their
methods are stored in the Business Objects Repository.
BOR Business Object Repository, which is the object-oriented
Repository in the SAP R/3 system. It contains, among
other objects, SAP Business Objects and their methods.
Business Object The representation of a business entity, such as an
employee or a sales order, in the SAP R/3 system.
Control record A special administrative record within an IDoc, one for
each IDoc. The control record contains a standard set of
fields that describe the IDoc as a whole.
ERP Enterprise Resource Planning business management
software.
IDoc Intermediate document. An IDoc is a report, that is, a
hierarchical package of related records, generated by SAP
R/3 in an SAP proprietary format. An IDoc, whose trans-
mission is initiated by the source database, exchanges
data between applications.
IDoc type Named meta data describing the structure of an IDoc that
is shared across databases. It consists of a hierarchy of
segment record types.
MATMAS01 An example of an IDoc type.
PACK Packaged Application Connection Kit. Accesses and
extracts data from and loads data to SAP R/3.
PSA Persistent Staging Area.
R/3 Real time/three tiers.

RFC Remote Function Call. The SAP implementation of RPC
(Remote Procedure Call) in ABAP. It calls a function
module that runs on a different system from the calling
function. The Remote Function Call can also be called
from within the same system, but usually the caller and
callee are dispersed.
RFM Remote Function Module. A function that belongs to a
BOR object type and has a BAPI method name.
SAP Systems, Applications, and Products in Data Processing.
SAP is a product of SAP AG, Walldorf, Germany.
SCM Supply Chain Management. The solution that tracks
financial, informational, and materials processes and
identifies processing exceptions.
Segment A record within an IDoc that is identified by a segment
number.
Segment type A named record definition for segments within an IDoc
that is one level in the hierarchy of segment types within
an IDoc type.
tRFC port Transactional RFC port.
Variant A collection of predefined criteria, similar to a group of
values used as parameters.
Variants are attached to various processes used by
DataStage, for example, the ABAP program process. The
ABAP program referenced by that process itself has a
variant attached to it.


2
The ABAP Plug-In

This chapter describes the ABAP Extract plug-in, which is part of Version 5.0 of the
Ascential PACK for SAP R/3 for DataStage 6.0 or later. Use the ABAP Extract plug-
in to let DataStage extract data from the R/3 Repository using the ABAP extraction
program generated by the plug-in.
It describes the following for the ABAP Extract plug-in:
• Functionality
• Installation prerequisites
• Integrating DataStage with SAP R/3 systems
• Creating a DataStage job
• Defining ABAP Extract stage properties
• Extracting data from SAP R/3
• SAP R/3 table type support
• SAP R/3 data type support
DataStage provides enhanced SAP support with its Packaged Application
Connection Kit (PACK) for SAP R/3.
The ABAP Extract plug-in lets your company maximize its existing investments in
large ERP systems by complementing SAP standard software and consulting
services. It does this by automatically generating the ABAP program to extract
data from the R/3 Repository. The ABAP Extract plug-in lets users of all levels
efficiently build an extraction object, then generate an extraction program written
in the SAP proprietary ABAP programming language. Or, you can use an SQL
query to generate the ABAP program.
See "Terminology" on page 1-7 for a list of the terms used in this chapter.


Functionality
The ABAP Extract plug-in has the following functionality and enhancements:
• Lets you choose and define SAP connections using the GUI. You can select
a DataStage connection to SAP instead of entering the information into the
stage interface.
• Uses the ABAP Program page for all ABAP-related functionality.
• Lets you view the development status of the ABAP program.
• Lets you view referenced and referring tables.
• Lets you use the SQL Query Builder or an extraction object to generate the
ABAP program.
• Lets you overwrite the ABAP program as you save it to R/3.
• Uses the Data Transfer Method page for the functionality of the data
transfer methods.
• Uses the functionality of the former Access page for RFC-connection fail-
ures as you exit the stage editor.
• Lets you synchronize and validate R/3 columns and DataStage columns.
• Supports NLS (National Language Support). For information, see
DataStage Server Job Developer’s Guide.
The following functionality is not supported:
• Data transformations or mappings. Use the Transformer stage to do this.
• The use of subqueries in the WHERE and HAVING clauses for SQL
queries.


The functionality of the ABAP Extract plug-in can be represented in the following
diagram, in which the ABAP program is automatically generated to extract data
from the R/3 Repository.
[Figure: R/3 Extraction Plug-In Architecture — the GUI client in the DataStage
Designer and the run-time server in the DataStage Server communicate with the
SAP R/3 application server (RFC run-time/CPI-C run-time) over Links 1 and 4,
and with the DataStage Repository over Links 2 and 3.]

The links do the following:


1. In Link 1, the GUI client component logs on to the SAP R/3 application server
and retrieves meta data information.
2. In Link 2, the GUI client component reads and writes job properties (for
example, extraction object, ABAP program, logon information, and so on)
between the GUI client component and the DataStage Repository.
3. In Link 3, the run-time server component reads job properties from the
DataStage Repository.
4. In Link 4, the run-time server component makes a remote function call
(RFC) to run the ABAP program and processes the generated dataset.

Installation Prerequisites
In addition to the software requirements for the Ascential PACK for SAP R/3
described on page 1-2, the following are required for installing the ABAP Extract
plug-in:
• FTP server on the machine where the temporary extraction files from the
SAP R/3 are stored (this must be the same machine as the SAP application
server)
• Database account with read access to the SAP R/3 data dictionary tables


Integrating DataStage with SAP R/3 Systems


In addition to installing the client and server components of the Ascential PACK
for SAP R/3, you also need to integrate DataStage with the SAP R/3 system. Do
this by installing or creating the following two components on the SAP R/3
system:
• An RFC-enabled function module on the SAP R/3 system from which data
is extracted.
• An appropriate profile that can be assigned to the Ascential PACK for SAP
R/3 users. Profiles maintain a secure environment by providing the correct
authorizations to the users according to SAP practices.
The RFC source file (ZDSRFC.TXT) required to install the RFC is on the same CD
as the GUI client component. The text file is in the RFC directory.
The R/3 Administrator must install these two components by using the Transport
Management System (Transaction STMS), which is the recommended method
described in the next section. You can also create them manually by following step-
by-step procedures (see the following instructions and those in “Creating an
Authorization Profile Manually” on page A-2).

Importing the RFC and Authorization Profile


The R/3 Administrator must copy the cofiles and data files shipped with the
Ascential PACK for R/3 client CD to the appropriate directory on the SAP R/3 host
machine (see the next section, “Copying Cofiles and Data Files”).
You can then import them within SAP R/3 by doing one of the following:
• Using transaction STMS to access the Transportation Management System
• Using the tp import transport control program at the operating system
level

Copying Cofiles and Data Files


The cofiles and data files must be copied from the Ascential PACK for R/3 client
installation CD to the appropriate directory on the SAP R/3 machine before
importing them. To copy these files:
1. Log on to the SAP R/3 machine as <SID>adm, where <SID> is the system
ID, for example, C46.
2. Copy the K9nnnnn.P46 cofiles and R9nnnnn.P46 data files from the
\trans\cofiles and \trans\data directories on the client CD to the
/usr/sap/trans/cofiles and /usr/sap/trans/data directories respectively
on the SAP R/3 system by doing one of the following, depending on your
network configuration:
a. Accessible file system. If the file system for your SAP R/3 system is
visible (for example, by NFS) on your DataStage client system, you can
copy the K9nnnnn.P46 cofiles using a DOS command or
Windows Explorer. Copy them from trans\cofiles on the client CD to
/usr/sap/trans/cofiles/ on the SAP R/3 system.
b. Unmapped network drive. If the SAP R/3 system is accessible by the
network, but it is not mapped as a network drive on the DataStage client
system, you can use FTP in binary mode to transfer the files using the
directories specified in step a.
c. Inaccessible file system. If the SAP R/3 system is not accessible by the
network from the DataStage client system, you can mount the DataStage
client installation CD directly on your SAP R/3 system and copy the files
specified in step a. Because of differences in CD file system formats on
UNIX platforms, the pathnames may vary as you mount the CD as
follows:
AIX, Solaris, or LINUX Platforms. The file names are in lower case. Copy
the cofiles to their destination in upper case.
HP-UX or Tru64 Platforms. The file names and directory names are in
upper case, and the file names have ‘;1’ appended to them. Copy the
cofiles to their destination without this ‘;1’ appendage.
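The platform-specific renaming rules in step c can be sketched as a small helper. This is an illustrative Python sketch (the manual does not ship such a script), and the transport number 12345 stands in for the elided nnnnn:

```python
def normalize_cofile_name(cd_name: str) -> str:
    """Return the destination name for a transport file copied from the
    installation CD, following the platform rules above."""
    # HP-UX and Tru64 CD file systems append a ';1' version suffix;
    # drop it before copying.
    name = cd_name.split(";")[0]
    # AIX, Solaris, and Linux present the names in lower case; either
    # way, the destination names must be upper case.
    return name.upper()

# Hypothetical transport file names (real numbers replace '12345'):
print(normalize_cofile_name("k912345.p46"))    # K912345.P46 (AIX/Solaris/Linux)
print(normalize_cofile_name("K912345.P46;1"))  # K912345.P46 (HP-UX/Tru64)
```

Both CD styles normalize to the same upper-case destination name.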

Importing Transport Requests


To import the transport request, the SAP R/3 Administrator can use one of the
following two methods:
Importing using Transport Management System (using Transaction STMS). To
successfully import the transport request, the SAP R/3 Administrator must do the
following:
1. Log onto the target system and client as the SAP R/3 Administrator.
2. Create a development class named ZETL that is assigned to a transport
layer according to the system landscape for your company. You can use
Transaction SE80 to create this class.
3. Enter Transaction STMS. The initial Transport Management System
screen appears.


4. Choose Overview ➤ Imports. The system reads the import queues.


5. Select <SID> (or double-click <SID> in the queue column), then choose
import queue ➤ Display.
6. Choose Extras ➤ Other requests ➤ Add to add the transport request to
selected queue.
Type the transport request number in the next screen as P46K9nnnnn and
continue.
Repeat this step for the other transport request.
7. Select the transport request in the import queue, and choose Request ➤
Import.
Type the client number to which to import, and continue.
Repeat this step for each transport request.
8. Verify that the Z_RFC_DS_SERVICE RFC function module and the
Z_DSPROFILE profile are successfully imported and activated.
Importing at the operating system level. If you are importing at this level, you can
do the following (instead of the previous steps 3-7):
1. Log onto the target system and client as the SAP R/3 Administrator.
2. Create a development class named ZETL that is assigned to a transport
layer according to the system landscape for your company. You can use
Transaction SE80 to create this class.
3. Add the transport request to the buffer using the following syntax:
tp addtobuffer <transport request> <SID>
For example, tp addtobuffer P46K9nnnnn DD1
4. Import the transport request using the following syntax:
tp import <transport request> <SID> client=<NNN>
where <NNN> is a client number like 800.
For example, tp import P46K9nnnnn DD1 client=800
5. Verify that the Z_RFC_DS_SERVICE RFC function module and the
Z_DSPROFILE profile are successfully imported and activated.
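The tp syntax in steps 3 and 4 can also be composed programmatically, for example when scripting imports across several systems. The following Python sketch is hypothetical, and the transport number 12345 stands in for the masked nnnnn:

```python
def tp_addtobuffer(request: str, sid: str) -> str:
    # tp addtobuffer <transport request> <SID>
    return f"tp addtobuffer {request} {sid}"

def tp_import(request: str, sid: str, client: int) -> str:
    # tp import <transport request> <SID> client=<NNN>
    return f"tp import {request} {sid} client={client}"

print(tp_addtobuffer("P46K912345", "DD1"))  # tp addtobuffer P46K912345 DD1
print(tp_import("P46K912345", "DD1", 800))  # tp import P46K912345 DD1 client=800
```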


Creating a DataStage Job


To create and build your job:
1. Create a DataStage job using the DataStage Designer.
2. Create a new ABAP Extract stage.
3. Define the properties for the ABAP Extract stage.
4. Define the SAP R/3 connection information.
5. Select the data transfer method.
6. Specify the ABAP program parameters.
7. Define the data to be extracted.
8. Compile the job.
These tasks are described in detail in the following sections.

Defining Stage Properties


When you edit the ABAP Extract plug-in stage, the ABAP Extract Stage dialog
opens with the Stage page on top. The ABAP plug-in for release 5 of the Ascential
PACK for SAP R/3 uses this icon:
[Figure: ABAP Extract stage icon]

The stage dialog has two pages:


• Stage. Displays the name of the stage you are editing. You can define
connection and logon details to a target and source SAP system. You can
also describe the purpose of the stage in the Description field of the
General tab, which appears by default. This page is similar to that in other
R/3 and BW plug-ins except the connection information appears at the
stage level, not at the link level.
The NLS tab defines a character set map to use with the stage, if NLS is
enabled. For details, see “Defining Character Set Maps” on page 2-8.
• Output. Specifies the method and details to connect to the SAP R/3 system,
the options that determine how the ABAP extraction program is run,
options to configure how the extracted data is transferred to the DataStage
server system, and lets you view meta data for the SAP R/3 fields. For
details, see “Defining Output Properties” on page 2-8.

Defining Character Set Maps


You can optionally define a character set map for an ABAP Extract stage using the
NLS tab.
You can change the default character set map defined for the project or the job by
selecting a map name from the list. This tab also has the following components:
• Show all maps. Lists all the maps supplied with DataStage. Maps cannot
be used unless they are loaded using the DataStage Administrator.
• Loaded maps only. Displays the maps that are loaded and ready to use.
• Use Job Parameter… . Lets you specify a character set map as a parameter
to the job containing the stage. If the parameter has not yet been defined,
you are prompted to define it from the Job Properties dialog.
• Allow per-column mapping. Lets you specify character set maps for indi-
vidual columns within the table definition. If per-column mapping is
selected, an extra property, NLS Map, appears in the grid in the Columns
tab.
For more information about NLS or job parameters, see DataStage documentation.

Defining Output Properties


To define output properties, click the Output page. It has the name of the output
link in Output name. Click Columns… to see a list of columns for the link.
This page also has the following tabs:
• General. Specifies the method and details to connect to SAP R/3 (see “The
Output General Page” on page 2-9).
• Data Transfer Method (formerly the Runtime page). Specifies how to
transfer extracted data to the DataStage server (see “Selecting the Data
Transfer Method” on page 2-13).
• ABAP Program (formerly the ABAP/4 Options page). Specifies the param-
eters for ABAP program operations and lets you see the status of the
program. You can also log on to SAP to review code without leaving the
application by clicking ABAP Workbench (formerly Launch SAP GUI…
on the Output General page). For details about this page, see “The Output
ABAP Program Page” on page 2-18.


• Columns. Lets you view the meta data of the SAP R/3 field that corre-
sponds to the currently selected DataStage column. You can also
synchronize and validate columns (see “The Output Columns Page” on
page 2-46).

The Output General Page


The General page opens by default:
[Figure: Output page, General tab]

This page works similarly to the General tab of the BW Load Input page for the
BW Load plug-in (for details, see Ascential PACK for SAP BW, 00D-025DS60). You
can use the DataStage server to access the list of connections to SAP, which is stored
in a file.
By default, a newly created stage uses the connection most recently selected by the
current user for other stages of this type. Since the number of R/3 systems accessed
by a given DataStage installation is likely to be small, the default connection is
generally correct.
Enter the following information on the General tab:
• DataStage Connection to SAP. The DataStage connection to the SAP R/3
system that is defined on the DataStage server machine and shared by all
DataStage users connected to that machine. The fields in this area are read-
only and are obtained from the connection that you selected.


Name. The name of the selected connection to the SAP R/3 system that
generates the data to be extracted.
Select… . Click to choose a DataStage connection to the SAP R/3 system.
The selected connection provides all needed connection and default logon
details that are needed to communicate with the corresponding SAP R/3
system. This opens the Select DataStage Connection to SAP dialog (see
“Defining SAP Connection and Logon Details” on page 2-10). You can add
new entries here and modify or delete existing entries from the list of
connections.
Description. Additional information about the selected connection.
Application Server. The name of the host system running R/3.
System Number. The number assigned to the SAP R/3 system used to
connect to R/3.
• SAP Logon Details. User Name, Client Number, and Language default to
the values last entered by the current user for the ABAP Extract stage.
User Name. The user name that is used to connect to SAP.
Password. A password for the specified user name.
Client Number. The number of the client system used to connect to SAP.
Language. The language used to connect to SAP.
• Description. Enter text to describe the purpose of the stage.
• Validate Stage… (formerly Job Validation). Click to open the Validation
Stage dialog. The validation process begins automatically when the dialog
opens.
See “Validating Run-time Jobs” on page 2-49 for details.
Extraction Object, Review Code, and Launch SAP GUI (formerly on the General
page) are now on the ABAP Program page (see "The Output ABAP Program Page"
on page 2-18).

Defining SAP Connection and Logon Details


Click Select on the General tab of the Output page to choose a DataStage
connection to the SAP R/3 system. This opens the Select DataStage Connection to
SAP dialog.


The selected connection provides all needed connection and default logon details
to communicate with the corresponding SAP R/3 system. You can add new entries
here and modify or delete existing entries from the list of connections.
Although SAP logon details for the ABAP Extract plug-in are not stored on the
server, the plug-in can share SAP connections created by other R/3 plug-ins,
such as the BAPI, IDoc Load, and IDoc Extract plug-ins.
New…, Properties…, and Remove are administrative connection operations.
They let you manage the list of connections that is maintained on the DataStage
server machine from this dialog:
• New… . Click to open the Connection Properties dialog, which lets you
define the properties for the new connection. It is added to the list of
connections. Here you can specify SAP connection details and default
logon details for the new connection. The Connection and Logon Details
page appears by default (see the following section).


• Properties… . Click to open the Connection and Logon Details page of the
Connection Properties dialog, showing the properties of the selected
connection. This is the same dialog that is opened when you click New…,
but in this context the connection name is read-only (see “Defining Connec-
tion Properties” on page 2-10).
• Remove. Click to delete the selected connection after your confirmation.
These administrative operations function similarly to the corresponding buttons
on the DataStage Connections to SAP page of the DataStage Administrator for
SAP utility (see “The Administrator for SAP Utility,” beginning on page 6-1).

Defining Connection Properties


The Connection Properties dialog has the Connection and Logon Details page.
Use the dialog to create a new SAP connection, or to view the properties of the
currently selected connection, depending on whether you open it using New… or
Properties… on the Select DataStage Connection to SAP dialog respectively.
To define the DataStage connection to SAP:
1. Specify the SAP connection details:
• Application Server. The name of the R/3 server.
• System Number. The system number of the R/3 instance.
• Router String. Optional. The string used to connect to the remote SAP
server. If you use a SAP router to access your system, you must include a
fully-qualified node name for your RFC server machine in the SAP router
table.
2. Select Use load balancing to use load balancing when connecting to R/3.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection
details specific to load balancing can be entered (see “Load Balancing” on
page 2-13).
3. Specify the CPI-C Connection details. These values are read-only and
match the entries in SAP Connection Details if Use load balancing is
selected. Otherwise, the fields are editable and define a non-load
balancing connection that can be used at run time for CPI-C processing.

Load Balancing
Select Use load balancing on the Connection and Logon Details page of the
Connection Properties dialog (see page 6-5) to balance loads for SQL queries
when connecting to the R/3 system. Load balancing at design time works as
follows:
• R/3 lets you make logon connections through a message server. The
message server uses an algorithm that considers server workload and
availability to choose an appropriate application server to handle the
logon.
• When connections are configured, you can choose a load balancing connec-
tion to a message server rather than a specific R/3 instance to retrieve and
validate IDoc types and meta data, for example.

Note: Load balancing cannot be used at run time.
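The message server's choice described above can be pictured with a toy selection function. This Python sketch is an illustration only; the availability-then-workload ordering is an assumption, not SAP's actual algorithm, and the server names are invented:

```python
def choose_application_server(servers):
    """Pick an application server from a logon group: availability first,
    then the lowest reported workload. `servers` is a list of
    (name, available, workload) tuples."""
    candidates = [s for s in servers if s[1]]
    if not candidates:
        raise RuntimeError("no application server available in the group")
    return min(candidates, key=lambda s: s[2])[0]

group = [("sapapp1", True, 0.72),   # up, but busy
         ("sapapp2", False, 0.10),  # down, excluded despite low load
         ("sapapp3", True, 0.35)]   # chosen
print(choose_application_server(group))  # sapapp3
```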

Selecting the Data Transfer Method


Use the Data Transfer Method tab on the Output page to specify how extracted
data is transferred. It is similar to the Runtime page in the previous version of
the ABAP Extract plug-in. It shows only controls relevant for the selected
transfer method.


This page displays the selected Data Transfer Method, which controls how to
process the dataset. Choose one of three data transfer methods:
• CPI-C Logon. Default. Transfers data from the SAP application server
directly to the DataStage server, without using FTP or other utilities.
The Local File option has no effect, that is, no flat files are generated in the
SAP application server. The SAP user name must be the CPI-C user type.
• FTP. Uses the FTP service in the SAP application server to get the dataset
from the SAP application. (The buttons for ABAP program loading in the
former version are now on the ABAP Program page. See page 2-18 for
more information.)
Choosing FTP enables the FTP Logon fields, which are required. Enter the
pathname for the data file on the remote system to use during run time in
Path of Remote File. This pathname is used to access the data file when you
may not have direct access to the data file. The data file contains the data
extracted from R/3. This file is transferred back to the DataStage server
system.
Alias for Remote Path. Optional. Use this if you cannot access the data file
using the pathname in the Path of Remote File field, for example, if the
system administrator has restricted access to a directory in the path. In this
case, use a relative pathname to access the file. If the DataStage server cannot
transfer the data file using the path specified in the Path of Remote File field,
it uses the path in the Alias for Remote Path field to access the file.


Alternate Host Name. Specify an alternate host name for FTP to use as an
alias to access files on the application server’s machine (where the ABAP
program runs). This is necessary if, for security reasons, FTP cannot access
files on the application server’s machine using the application server name.
Run the program as a background process. Specifies whether to run ABAP
programs in the background (the default is cleared). For details, see
“Running Programs in the Background” on page 2-15.
• Local File. Use this option if the file containing the dataset cannot be
accessed using FTP. The resultant dataset file can be placed on the
DataStage server machine by the Administrator, and DataStage can then
access it from there using the Local File option.
Choosing this option enables the Local Data File field and Browse… . Type
a name for the data source file, or click Browse… to search for a name. SAP
R/3 needs write access to this file.

Note: When you switch between CPI-C and FTP or vice versa after gener-
ating ABAP code, a message tells you to regenerate the code.

• Use SAP Logon Details. If selected for a CPI-C data transfer method, User
Name and Password are read-only and display the corresponding values
from the General page.
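The fallback between Path of Remote File and Alias for Remote Path can be sketched as follows. This Python sketch is illustrative only; a throwaway temporary file stands in for the extracted dataset, and the restricted direct path is invented:

```python
import os
import tempfile

def resolve_remote_file(path_of_remote_file, alias_for_remote_path=None):
    """Return the first pathname through which the extracted data file can
    be reached: Path of Remote File first, then the optional Alias for
    Remote Path, mirroring the fallback described above."""
    for candidate in (path_of_remote_file, alias_for_remote_path):
        if candidate and os.path.exists(candidate):
            return candidate
    raise FileNotFoundError("data file not reachable by either pathname")

# Demonstration: the direct path is assumed restricted (nonexistent here),
# so the alias pathname is used instead.
with tempfile.NamedTemporaryFile(delete=False) as f:
    alias_path = f.name
try:
    resolved = resolve_remote_file("/restricted/extract.dat", alias_path)
    print(resolved == alias_path)  # True
finally:
    os.unlink(alias_path)
```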

Running Programs in the Background


This option on the Data Transfer Method tab of the Output page specifies whether
to run ABAP programs in the background (the default is cleared). For details about
installing an RFC manually for this functionality, see page A-5.
To run ABAP programs in the background:
1. Select Run the program as a background process for FTP data transfers from
the Output Data Transfer Method page. (This check box does not appear for
CPI-C transfers). This functionality lets you run ABAP programs indefinitely
as background processes without being timed out.


2. The plug-in runs the generated ABAP program as a background process
within R/3. You must provide a unique variant name if you want to pass
job parameters to the ABAP program. This facilitates the creation of a
variant that can be added to the background process during run time.


This variant is automatically deleted after the associated job completes or
aborts.

3. The run time implements BAPI calls to:
• Log on to the R/3 system as an external user using the XBP interface
• Create a background process job
• Create a variant if necessary
• Add the ABAP program and corresponding variant to the job
• Schedule the job to run immediately
• Check the status of the background process job
• Delete the job and the associated variant after the job completes (or aborts)
• Log off from R/3
4. To monitor the status of the job from Scheduled to Finished, use the Job
Overview screen in the SAP GUI.


The job name is the same as the name of the generated ABAP program. This
helps you easily identify the job.

Your job disappears from the list after the ABAP run time deletes it.
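The run-time sequence in step 3 can be summarized as a simplified simulation. This Python sketch is hypothetical: the real implementation makes XBP BAPI calls against R/3, and the program name ZDSEXTRACT01 and variant naming scheme are invented for illustration:

```python
def run_background_job(program_name, parameters=None, polled_statuses=None):
    """Simulate the run-time sequence: logon, create job, optional variant,
    schedule, poll status, clean up, logoff. Returns the final status and
    an ordered log of the steps."""
    log = ["logon via XBP interface"]
    job_name = program_name                     # job named after the ABAP program
    log.append(f"create background job {job_name}")
    variant = None
    if parameters:                              # variant only if passing parameters
        variant = f"VAR_{job_name}"
        log.append(f"create variant {variant}")
    log.append("add ABAP program and variant to job")
    log.append("schedule job to run immediately")
    status = "Scheduled"
    for status in (polled_statuses or ["Running", "Finished"]):
        log.append(f"status: {status}")         # monitor Scheduled -> Finished
        if status in ("Finished", "Aborted"):
            break
    log.append("delete job and variant")        # cleanup happens even on abort
    log.append("logoff")
    return status, log

status, log = run_background_job("ZDSEXTRACT01", parameters={"P_WHERE": "..."})
print(status)  # Finished
```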

The Output ABAP Program Page


This page handles all the properties and operations related to the ABAP program.


This includes program generation (formerly done using the Extraction Object
dialog on the Access General page), editing, saving, and validation.

This page has the following components:


• ABAP Program ID. Enter the program ID as you sequentially define
required properties on the tabs on the Output page, from left to right.
Otherwise, you see Program needs ID in ABAP Program Development
Status.
• Generation Method includes three ways to create the ABAP program:
– Build SQL query. This option (the default) lets you define the data to be
extracted by constructing a SQL query. (It uses the same SQL Builder
interface as that for other stages.) It does this by causing Build to open the
Build Open SQL Query dialog, described in“Building SQL Queries” on
page 2-22.
–Build extraction object. When you select this option, Build opens the
Build Extraction Object dialog, which disables Options. This method
gives you backward compatibility (the dialog is similar to the Extraction
Object dialog in the former version). This dialog includes OK and Cancel
(instead of just Close as in the previous version). Generate ABAP/4 and
Connect no longer exist. See “Building Extraction Objects” on page 2-31
for more information.


– Enter program as text. If selected, Build… , Options… , and Generate
Program are disabled. In this case, you can open ABAP Editor (using Edit
Program…) to create the program by hand. If you edit a generated
program, Generation Method changes to Enter program as text.
• Build… . Opens a dialog to build an extraction object or SQL query,
depending on the selection generation method.
Click to open the Build Extraction Object dialog if Build extraction object
is selected as the Generation Method. (Options is disabled.)
The Build Extraction Object dialog is similar to the Extraction Object
dialog in the former version, except it uses OK and Cancel (instead of
merely Close), and Generate ABAP/4 and Connect no longer exist (see
“Building Extraction Objects” on page 2-31).
Opens the Build Fully Generated Query dialog if Build SQL Query is
selected as the Generation Method. See “Building SQL Queries” on
page 2-22 for details.
• Options… . Lets you further customize the generated ABAP by opening
the Program Generation Options dialog, described on page 2-45. This
option is only available if Build SQL Query is selected.
• Load Method. Specifies how to load the ABAP program. The options
depend on the data transfer method:
– DataStage job loads the program. (Program ready for runtime load
appears as status after program generation.)
– Current user loads the program from this dialog box.
– SAP administrator loads the program manually.
• Delete the program from R/3 after it runs.
• Run Method. Specifies how to run the ABAP program. Options depend on
the data transfer and the load methods.
– DataStage job runs the program. Selected by default. If the data transfer
method is CPI-C, only this option appears. The same limitation applies if
you select DataStage job runs the program as the load method.
– SAP administrator runs the program manually. Appears if DataStage
job loads the program is not selected.


• ABAP Program Development Status. Displays the following read-only
status for the ABAP program you are developing so you know the next
step. For example:
– SQL query needs to be built. Appears when you specify an ID, but the
selected Generation Method is Build SQL query, and you need to build
the query.
– Program not used for “Local File” data transfer method. Appears when
you select Local File as the data transfer method (everything on this page
is disabled).
– Program needs to be generated. Appears when the program is not yet
generated, or if the relevant stage properties differ since the last genera-
tion of the program (program generation formerly handled in the
Extraction Object dialog).
– Program needs to be loaded to R/3. Appears if you decline the system
offer to load the program.
– Program needs to be saved to a file. Appears if you decline the system
offer to save the program.
– Program loaded to R/3. Appears after you load the program to R/3.
– Program saved to file <file path>. Appears after you save the program to
a file.
– Program ready for runtime load. Appears after program generation
when Load Method is DataStage job loads the program.
– Program needs ID. Appears if you need to provide ABAP Program ID.
• Generate Program. Generates the ABAP program, disabling the button.
Build and Options remain enabled, if Generation Method is Build SQL
Query. If you click Build or Options, and if the system knows about any
program edits since generation-time, you must first click Clear Program to
delete the existing edited program to use any changes to regenerate the
program.
After generating the program, you can load the program to R/3 or save it to
a file, depending on the Load Method setting (unless the setting is
DataStage job loads the program). If the program needs to be generated,
the development status is Program needs to be generated.
If you save or load the program, the development status is Program saved
to file <file path> or Program loaded to R/3, as appropriate.


The button is disabled after you generate the program, and all the
previously disabled buttons are enabled, including Edit Program and Editor
Options.
• Load Program to R/3. Appears if you need to load the program to R/3.
• ABAP Workbench. Opens the SAP GUI, replacing Launch SAP GUI on the
General page in the former version.
• Save Program as File… and Clear Program are the other buttons you see,
if appropriate.
• Edit Program… . Lets you edit the program and save the modifications.
• Editor Options… . Opens a dialog that lets you indicate what editor to use
to edit the program.
• Validate Program. Verifies the program syntax. If errors exist, the ABAP
editor for the stage opens and highlights the error, similar to validation
done from the editor.

Note: When you save the ABAP program to R/3, you see a warning if a program
with the same name already exists on the R/3 system (you can overwrite
the program).

You can also click Load Program to R/3, Save Program as File…, Clear Program,
Edit Program…, Editor Options…, or Validate Program if appropriate.
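The development status values listed above can be summarized as a decision function. This Python sketch is an illustration; the exact precedence of the checks is an assumption rather than documented behavior, and the program ID ZDSX01 is invented:

```python
def development_status(transfer_method, program_id, generation_method,
                       query_built, generated, load_method,
                       loaded=False, saved_path=None):
    """Return the read-only ABAP Program Development Status message for a
    given stage state (simplified sketch of the statuses listed above)."""
    if transfer_method == "Local File":
        return 'Program not used for "Local File" data transfer method'
    if not program_id:
        return "Program needs ID"
    if generation_method == "Build SQL query" and not query_built:
        return "SQL query needs to be built"
    if not generated:
        return "Program needs to be generated"
    if load_method == "DataStage job loads the program":
        return "Program ready for runtime load"
    if loaded:
        return "Program loaded to R/3"
    if saved_path:
        return f"Program saved to file {saved_path}"
    return "Program needs to be loaded to R/3"

print(development_status("FTP", "ZDSX01", "Build SQL query",
                         query_built=True, generated=True,
                         load_method="DataStage job loads the program"))
```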

Building SQL Queries


You can also define the data to be extracted by constructing a SQL query. It uses
the same SQL Builder interface as for other plug-ins, for example, the Oracle Call
Interface plug-in. (See “Building Extraction Objects” on page 2-31 for information
about building extraction objects.)


If you select Build SQL Query as the Generation Method on the Output ABAP
Program page and click Build… , the Build Open SQL Query dialog opens with
the Tables page on top:

The Build Open SQL Query dialog contains the Tables (the default), Select,
Where, Having, Order By, and SQL pages.
Whereas the SQL Builder normally customizes the generated SQL to match the
specific SQL syntax of the source database, here the generated SQL conforms to
the SAP proprietary Open SQL syntax used by ABAP programs.
The first five pages of the Build Open SQL Query dialog correspond to a clause in
the generated SQL. The SQL page shows the complete SQL statement as
determined by the settings in the preceding pages.

Build Open SQL Query Tables Page


Use to select a table name from the available tables. This page includes the All
Tables (the default) and Search for Table tabs.


Click > and the selected table appears in the Selected tables list.
Join Type determines the type of join (through the Inner >, Left >, and
Right > buttons) between two or more specified tables as follows. The tree
representation of join nesting is described later.
• Inner >. Adds the table using an inner join.
• Left >. Adds the table using a left outer join.
• Right >. Adds the table using a right outer join.
Click < to remove the selected table or join.
Click << to remove all tables and joins.
Show Related Tables displays tables related to the selected table through foreign
or primary keys. The related tables are listed in two folders that appear in a tree
structure below the selected table. These folders are labelled Referenced Tables
and Referring Tables. When related tables are added to the join tree (Selected
Tables), default join conditions are created automatically based on the key
relationships.
The join condition and join type of the selected join are shown in controls below
the join tree. Through these controls, you can modify the properties of the join.

Build Open SQL Query Tables All Tables Page


The All Tables page includes the Available Tables control, which shows the R/3
tables organized according to the SAP application hierarchy. This hierarchy groups
the tables into folders and subfolders based on the business category of the data
that is contained in the tables. (The organization of these folders is the same as that
in the SAP R/3 Module Tree dialog in the former version of this plug-in.)
Each table entry in Available Tables includes both the name and the Short Text for
the table to help you identify the entries.


Build Open SQL Query Tables Search for Table Page


This page lets you perform a simple query to search for the desired table. The
tables resulting from such a search can be added one by one to the join tree for the
query (shown in Selected Tables).
The following example displays an inner join:
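In Open SQL terms, an inner join between two selected tables has roughly the following shape (the table and field names here are illustrative, not taken from the dialog):

```sql
SELECT T1~CARRID T1~CONNID T2~FLDATE
  FROM SPFLI AS T1
  INNER JOIN SFLIGHT AS T2
    ON T1~CARRID = T2~CARRID
   AND T1~CONNID = T2~CONNID
```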


Build Open SQL Query Select Page


Use to select the specific columns to be included in the SQL statement.


The fields in Available columns include the Short Text for each field to help you
identify the ones you want.

Select the desired columns from Available columns.


Click > and the selected columns appear in the Selected columns list.
Click >> to add all available columns.
The Agg menu includes these options: SUM, AVG, MIN, MAX, and COUNT. The
menu lets you add new columns that apply one of these functions to the selected
available column.
Columns in the Selected columns list can be reordered using the MOVE UP and
MOVE DOWN buttons.
Columns in the Selected columns list can be removed using the < (remove
selected) and << (remove all) buttons.


Click Find to open a dialog that lets you search for a field in Available columns or
Selected columns, depending on where the focus is when you click Find.
Use the Key column checkbox to specify which columns should be used as key
columns for the link.

Build Open SQL Query Where Page


Use to construct the SQL WHERE clause. Operators include: =, <>, <, <=, >, >=,
BETWEEN, NOT BETWEEN, IN, and NOT IN. For example:
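A Where clause that combines several of these operators might look like the following (the field names are illustrative):

```sql
SPRAS = 'E'
AND LAND1 IN ('US', 'DE')
AND ERDAT BETWEEN '19990101' AND '19991231'
```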

When you click OK, the entire query is validated. The plug-in validates the syntax
of the Where and Having tabs and verifies the presence of at least one table and at
least one selected column. Errors are reported as warnings, but you can exit the
dialog without fixing the errors.
You can use DataStage job parameters on the Where tab of the SQL Builder by
typing the parameter reference into one of the grid boxes. In this case, the
parameter references are in the standard format that is used elsewhere in
DataStage (#PARAM#, where PARAM is the name of the job parameter).
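For instance, a condition that brackets the language key between two job parameters might look like this before generation (SPRAS is an illustrative field name; the parameter references use the standard DataStage form described above):

```sql
SPRAS >= #LANG_LOW# AND SPRAS <= #LANG_HIGH#
```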


For example, the #LANG_LOW# and #LANG_HIGH# parameters specify the
range of values for the Where clause. You can click the SQL page to see the
conversion of parameter references to the format required by ABAP (see “Build
Open SQL Query SQL Page” on page 2-30). This conversion is done automatically
when the SQL is generated.
See “Using Job Parameters” on page 2-43 for information about using DataStage
job parameters with extraction objects.

Build Open SQL Query Having Page


Use to construct the SQL HAVING clause.
When you click OK, the entire query is validated. The plug-in validates the syntax
of the Where and Having tabs and verifies the presence of at least one table and at
least one selected column. Errors are reported as warnings, but you can exit the
dialog without fixing the errors.
You can use DataStage job parameters on the Having tab of the SQL Builder by
typing the parameter reference into one of the grid boxes. In this case, the
parameter references are in the standard format that is used elsewhere in
DataStage (#PARAM#, where PARAM is the name of the job parameter). You can
click the SQL page to see the conversion of parameter references to the format
required by ABAP (see “Build Open SQL Query SQL Page” on page 2-30). This
conversion is done automatically when the SQL is generated.
See “Using Job Parameters” on page 2-43 for more information about using
DataStage job parameters with extraction objects.

Build Open SQL Query Order By Page


Use this page to construct the SQL ORDER BY clause.
The Available columns are the ones selected on the Select page. The columns listed
higher in the Order by columns list have a higher precedence in the resulting sort
order. You can modify the precedence by using the MOVE UP and MOVE DOWN
buttons.
To change Ascending to Descending (or vice versa), click Ascending in the list
control. A combo box appears that allows changes.


Build Open SQL Query SQL Page


Use this page to display the SQL statement generated from the input to the
previous pages. The statement conforms to the SAP proprietary Open SQL syntax
used by ABAP programs.
The meta data is obtained directly from the R/3 system from which the data is
extracted, for example:

The Open SQL statement is like the statement that appears in the generated ABAP
program, but without the INTO clause.
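A statement of the kind shown on the SQL page might look like the following (the table and field names are illustrative):

```sql
SELECT T1~CARRID T1~CONNID T1~CITYFROM T1~CITYTO
  FROM SPFLI AS T1
 WHERE T1~CARRID = 'LH'
 ORDER BY T1~CONNID
```

In the generated ABAP program, the same statement additionally carries an INTO clause.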
The SQL page displays the conversion of parameter references to the format
required by ABAP. This conversion is done automatically when the SQL is
generated.
Click OK to accept the SQL statement or Cancel to discard all the input.


Building Extraction Objects


You can define the data to be extracted by building an extraction object when you
use Build Extraction Object as your generation method (see “The Output ABAP
Program Page” on page 2-18 for more information about this page).
(See “Building SQL Queries” on page 2-22 for information about building SQL
queries.)
To build the object:
1. Connect to the SAP server.
2. Search for tables.
3. Specify the SQL conditions.
4. Generate and review the SAP R/3 ABAP code.
5. Examine the extraction object meta data.
6. Upload the SAP ABAP code into SAP.
Each of these tasks is described in more detail in the following sections.

Connecting to the Server


To connect to the SAP server:
1. Enter the appropriate values on the General tab in the Output page.
2. Select Build Extraction Object as the Generation Method on the Output
ABAP Program tab.


The Build Extraction Object dialog appears, for example:

Searching for Tables


Dialogs such as Extraction Object Editor, ABAP/4 Code Viewer, and so forth, have
a right-click shortcut menu and the usual buttons.
Use any of the following methods to search for an SAP table to add to the
extraction object. Click the appropriate button, choose the appropriate command
from the Extraction Object Editor shortcut menu, or use the shortcut keys.
• Add tables from Search (Alt+s)
• Add tables from Module Tree (Alt+m)
• Add tables from Logical Database (Alt+l)
Before learning how to add a table, read the next section about navigation.


About Navigation
When you navigate the SAP R/3 Module Tree or the SAP R/3 Logical Database
Tree, double-click an entry to expand it. Because SAP R/3 is such a large system,
load meta data only when you want to interact with the system.
In the Extraction Object Tree, the root is the highest level and the field is the lowest
level. Because DataStage processes input as a rowset, the second level can
contain only one table.
The following navigation guidelines pertain to the Build Extraction Object dialog.
Commonly executed operations have corresponding buttons and shortcut menu
commands.
• You cannot delete the root.
• You can relocate a table by selecting and dragging it to the desired location.
• A table can become a subtree of another table tree.
• You can delete tables by clicking Delete or choosing Delete from the
shortcut menu.
• You can move a column within one table, but you cannot move it outside
that table to another table.
• You cannot move columns to other columns.
• You can delete all or selected columns associated with a table by clicking
Delete, choosing Delete from the shortcut menu, or pressing Alt+e.

Adding a Table
1. Click Add Tables…, then click From Search…, or choose Add Tables from
the Extraction Object Editor shortcut menu. The R/3 Table Searching &
Adding dialog box appears.
2. To find a table, do one of the following:
• Type a full or partial table name using the * or % wildcard in the Table
Name field. For example, you can type t00, t00*, or t00%. You can use * or %
interchangeably in one table name.
• Type a description in the Table Description field. You can type a full or
partial name using a wildcard (* or %). This name is case-sensitive.
• Create a combination of table name and table description values. Use the
AND or OR option buttons to specify the Boolean relationships among the
selection criteria.
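The wildcard behavior described above can be sketched as follows (an illustrative sketch of the documented rules, not the plug-in's own code; the case-insensitive comparison of table names is an assumption, since only the description search is documented as case-sensitive):

```python
import fnmatch

def normalize(pattern: str) -> str:
    # The search dialog accepts * and % interchangeably in one
    # table name; fold both wildcards to * for matching.
    return pattern.replace("%", "*")

def matches(table_name: str, pattern: str) -> bool:
    # Assumption: table names compare case-insensitively, since SAP
    # dictionary table names are stored in uppercase.
    return fnmatch.fnmatchcase(table_name.upper(), normalize(pattern).upper())
```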


Click Search. The Search Results for Table grid is filled in, and the dialog
box title changes to “Searching result: n entries found.”


3. Do one of the following:


• Highlight a table, then click Table Definition, or choose Table Definition
from the Table List Operations shortcut menu. The Table definition dialog
appears:

• Highlight one or more table columns, then click Add to EO Tree or choose
Add to Extraction Object from the Table Definition Operations shortcut
menu. The columns are added to the Build Extraction Object dialog. Close
the Table definition dialog.
• Click Print, or choose Print from the Table List Operations shortcut menu
to send the search results to the printer.
• Click Save As, or choose Save As from the Table List Operations shortcut
menu to save the search results to a flat file.


4. To view the table contents, click Table Content, or choose Table Content
from the Table List Operations shortcut menu. The Table Contents
dialog box appears:

The table content list on the left under Select Field for Selection Criteria
displays the associated column details. You can search for a string in the field
name or a description using the Find What field, the Direction list, and Find.
Using Find What is especially helpful for searching large tables.
You must specify selection criteria in order to view the table contents.
a. To add a single value, select the column for which you are specifying the
condition. Select a comparison operator from the Operator list under
Single Value Operation.
Type the comparison value in the Value field. See “Selection Criteria Data
Types” on page 2-47 for information on entering values.
Click Add Single Value. The selection criteria is added to the SQL Selec-
tion Criteria box.
b. You can add a range of values in the same manner. Select the column for
which you are specifying the condition. Select INCLUDE or EXCLUDE
from the Operator list under Range Operation. Type the first value in the


range in the first Range of Values field and the last value in the range in
the second field. See “Selection Criteria Data Types” on page 2-47 for infor-
mation on entering the values. Click Add Range. The criteria is added to
the SQL Selection Criteria box.
Selecting INCLUDE or EXCLUDE produces a condition similar to the
following:
– INCLUDE valueA, valueB produces the condition
field >= min(valueA, valueB)
AND
field <= max(valueA, valueB)
– EXCLUDE valueA, valueB produces the condition
field > max(valueA, valueB)
OR
field < min(valueA, valueB)
You can create a complex condition by using a combination of single and
range values. Use the AND or OR option buttons to specify the Boolean
relationships among the selection criteria.
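The INCLUDE/EXCLUDE rule can be sketched as a small helper (an illustrative sketch of the documented rule, not the plug-in's code; values are rendered as plain text, without the type-specific quoting described later):

```python
def range_condition(field, op, value_a, value_b):
    """Build the condition fragment that INCLUDE/EXCLUDE produce for
    a range of two values, per the rule documented above."""
    lo, hi = sorted([value_a, value_b])
    if op == "INCLUDE":
        return f"{field} >= {lo} AND {field} <= {hi}"
    if op == "EXCLUDE":
        return f"{field} > {hi} OR {field} < {lo}"
    raise ValueError(f"unknown range operator: {op}")
```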

Note: If you make an error in specifying the condition, you can remove
the condition from the SQL Selection Criteria box by clicking
Clear Condition. This removes the entire selection criteria defini-
tion from the box. You can also manually edit the SQL condition.

c. If you do not specify any fields, this plug-in loads every field back
into the Fields for Selection list by default. If you use RFC to connect to
the SAP application server and the table is large (that is, the record
length is large), nothing may be loaded back. In this case, make sure that
the record you want to view is less than 512 bytes long. After you specify the selection


criteria, click View Contents. A Table Content dialog showing the table
contents appears. The rows are displayed in 500-row lots.

Select one of the following using the buttons or commands on the Table
Content Operations shortcut menu:
• Click Save As… to save the table contents to a flat file using the Save As
dialog box.
• Click Print… to send the table contents to the printer.
Close this dialog and the Table Content dialog to go to the R/3 Table
Searching & Adding dialog box.
5. After you verify that you have the correct table, select the table from the
grid in your R/3 Table Searching & Adding dialog box, and click Add to
Extraction… .


Select one of the following using the buttons or the shortcut menu on the R/3
Table Searching & Adding dialog box:
• Select Save As to save the table columns to a flat file using a standard Save
As dialog box.
• Select Print to send the table columns to the printer.
6. Close the R/3 Table Searching & Adding dialog box. The Build Extrac-
tion Object dialog now shows the added table.

Navigating the SAP R/3 Module Tree


1. On the Build Extraction Object dialog, click Add Tables…, then From
Module Tree…, or choose Navigate Module Tree from the Extraction Object
Editor shortcut menu. The SAP R/3 Module Tree dialog appears:

2. Double-click any entry to expand it to the next level. Click Expand or
Collapse to expand or collapse the entire tree.


3. The lowest level of any node in a module tree is a table. Right-click a table
to see its table definition or table content, or add it to the Build Extraction
Object dialog. For more information about the table definition and table
content, see “Adding a Table” on page 2-33.

Note: You can select multiple fields or multiple tables at the same time, but you
cannot select both fields and tables at the same time.

Navigating the SAP R/3 Logical Database Tree


1. On the Build Extraction Object dialog, click Add Tables…, then From
Logical Database…, or choose Navigate Logical Database from the Extrac-
tion Object Editor shortcut menu. The SAP R/3 Logical Database Tree dialog
appears:

2. Double-click any entry to expand it to the next level. Click Expand or
Collapse to expand or collapse the entire tree.


3. The lowest level in the logical database tree is a table. Right-click a table
to see its table definition or table content, or add it to the Build Extraction
Object dialog. For more information about the table definition and table
content, see “Adding a Table” on page 2-33.

Note: You can select multiple fields or multiple tables at the same time, but you
cannot select both fields and tables at the same time.

Specifying the SQL Condition for Extraction Objects


If you added tables to the extraction object, you can return to the Build Extraction
Object dialog to specify the SQL condition. To do this:
1. Click SQL Condition…, or choose Specify SQL Condition from the Extrac-
tion Object Editor shortcut menu. The SQL Condition Builder dialog box
appears:

2. Select a column from the Table list. You can search for a string in the field
name as well as a description using the Find What field, the Direction


list, and Find. Using Find What is especially helpful for searching large
tables.
3. Specify a single value for a field, or use the list to specify job parameters.
Click Add Single Value to add the selection criteria to the ABAP SQL
Condition box.
4. Specify a range of values for a field, or use the drop-down list to specify
job parameters. Click Add Range to add the selection criteria to the
ABAP SQL Condition box.
5. Click OK to add the condition to the Build Extraction Object dialog. You
can see the condition statement at the end of the table description in the
tree.
6. If you create a join condition, specify the SQL condition for the child
table. The parent table is displayed in the Join Table list. Double-click the
parent table to list its fields in the Join Field list.
7. Select the column name from the Table list. Select the column on which
you are doing the join from the Join Field list. Click Add Relationship to
add the condition to the ABAP SQL Condition box.
You can specify additional columns for the join condition. Select the columns
in the same manner, and click either AND or OR to specify the Boolean
condition.
8. Click OK to add the join condition to the Build Extraction Object dialog.
The condition is added to the table description in the tree.

Note: If you make an error in specifying the condition, you can remove the
condition from the ABAP SQL Condition box by clicking Clear Condi-
tion. This removes the entire selection criteria definition from the box.
You can also manually edit the SQL condition. You can then review
your SQL statement by clicking View SQL.


9. Highlight any entry, and click Property… on the Build Extraction Object
dialog to look at its properties and values in the Properties window.

Using Job Parameters


To use DataStage job parameters from within the ABAP code that is generated
from the specified extraction object, you need to perform certain tasks when
designing your job (see “The Output ABAP Program Page” on page 2-18 for
information about using Build Extraction Object as your generation method).


1. Define the necessary job parameters in the DataStage Designer by choosing
Edit ➤ Job Properties. Enter job parameters on the Parameters page.

In this example, LANG_LOW and LANG_HIGH are defined as job parameters
for this extraction job.


2. You can use job parameters in the SQL Condition Builder dialog by
using the syntax: DS_JOB_PARAM@job_parameter. In the following
example, LANG_LOW and LANG_HIGH specify a range of values in the
SQL condition. Likewise, you can use them in a single value or a join
operation.
Click Add Range to add the range condition to the ABAP SQL Condition box.
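For example, a range condition built from the two parameters above might read as follows (SPRAS is an illustrative field name; the exact text that the builder places in the ABAP SQL Condition box may differ):

```sql
SPRAS >= DS_JOB_PARAM@LANG_LOW AND SPRAS <= DS_JOB_PARAM@LANG_HIGH
```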

3. Generate the ABAP code. Save and compile the job when your job design
is complete. When you run this job, DataStage prompts for values for
each of the job parameters that are defined for this job. At run time, these
values are passed to the ABAP program that performs the data extraction
from SAP R/3.

The Program Generation Options Dialog


Click Options on the ABAP Program tab of the Output page to open the Program
Generation Options dialog. It lets you control how the ABAP program is


generated by letting you enter lines of ABAP code that are automatically inserted
into the generated program.
This lets you regenerate the program with all your custom insertions after you
modify the SQL query.
This dialog has the following components:
• Additional Program Header Comments
• Additional Commands to Execute just before the Program Starts
• Additional Commands to Execute just before the Program Exits
• Set as Default Options. If selected, the text in the three edit controls
appears as the defaults for these controls when the current user creates new
ABAP Extract stages.

The Output Columns Page


This page contains the column definitions for the data being output on the chosen
link. For details about column definitions, see your DataStage documentation.
The list of columns is automatically synchronized with the SQL query (or the
extraction object) whenever you change the set of extracted tables and fields that
is specified in the SQL query (or extraction object). But since you can modify the
columns from the DataStage Designer, Synchronize Columns is useful.
In case columns are accidentally deleted or modified, Synchronize Columns lets
you align the column list with the extraction object.

Completing the Job


Complete the definition of the other stages in your job according to normal
DataStage procedures. Compile and run the job.

SAP R/3 Table Type Support


The DataStage PACK for SAP R/3 supports the following tables types and views:
• Transparent Table (SAP Table Type: TRANSP)
• Cluster Table (SAP Table Type: CLUSTER)
• Pool Table (SAP Table Type: POOL)
• View Table (SAP Table Type: VIEW), which supports these views:


– Database View (View Type: D)


– Projection View (View Type: P)

SAP R/3 Data Type Support


The tables in the following sections document the support for SAP R/3 data types.

Selection Criteria Data Types


When you define selection criteria in the SQL Condition Builder, type the
values as follows for each SAP R/3 data type:
• ACCP (posting period, YYYYMM). Type the value in single quotation marks
as a year and month in the format YYYYMM, for example, ‘199906’.
• CHAR (character strings). Type the value in single quotation marks, for
example, ‘xyz’.
• CLNT (client). Type the value in single quotation marks, for example,
‘800’ (3-digit).
• CUKY (currency key, referenced by CURR fields). Type the value in single
quotation marks, for example, ‘USD’ for US dollars or ‘DEM’ for German
marks.
• CURR (currency field, stored as DEC). Type a numeric value, for example,
500.00.
• DATS (date field, YYYYMMDD, stored as char (8)). Type the value in single
quotation marks as year, month, and day in the format YYYYMMDD, for
example, ‘19990623’.
• DEC (counter or amount field with comma and sign). Type a numeric value
without quotation marks, for example, 8.0 or –8.0.
• FLTP (floating-point number, accurate to 8 bytes). Type a numeric value
without quotation marks, for example, 8.0 or –8.0.
• INT1 (1-byte integer, decimal number <= 254). Type a numeric value
without quotation marks, for example, 1 or 2 through 254.
• INT2 (2-byte integer, used only for the length field before VARC or
RAW). Type a numeric value without quotation marks, for example, 1 or 2
through 32767.
• INT4 (4-byte integer, decimal number with sign). Type a numeric value
without quotation marks, for example, 1 or 2 through 2^31–1.
• LANG (language key). Type a 1-digit language identifier in single
quotation marks, for example, ‘E’ for English or ‘F’ for French.
• LCHR (long character string, requires a preceding INT2 field). Do not
use this data type for specifying a selection condition.
• LRAW (long byte string, requires a preceding INT2 field). Do not use
this data type for specifying a selection condition.
• NUMC (character field with only digits). Type a numeric value without
quotation marks, for example, 8.0 or –8.0.
• QUAN (quantity field, points to a unit field with format UNIT). Type a
numeric value without quotation marks, for example, 8.0 or –8.0.
• RAW (uninterpreted sequence of bytes). Do not use this data type for
specifying a selection condition.
• TIMS (time field, hhmmss, stored as char (6)). Type the value in single
quotation marks as hours, minutes, and seconds in the format hhmmss, for
example, ‘091024’.
• UNIT (unit key for QUAN fields). Type the value in single quotation
marks, for example, ‘pk’.
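The quoting rules above can be summarized in a small helper (an illustrative sketch of the documented rules, not part of the product):

```python
# Character-like types are wrapped in single quotation marks.
QUOTED = {"ACCP", "CHAR", "CLNT", "CUKY", "DATS", "LANG", "TIMS", "UNIT"}
# Numeric types are written bare, without quotation marks.
NUMERIC = {"CURR", "DEC", "FLTP", "INT1", "INT2", "INT4", "NUMC", "QUAN"}
# Long and raw types cannot be used in a selection condition.
UNSUPPORTED = {"LCHR", "LRAW", "RAW"}

def format_selection_value(sap_type, value):
    """Render a selection-criteria value per the rules above."""
    if sap_type in UNSUPPORTED:
        raise ValueError(f"{sap_type} cannot be used in a selection condition")
    if sap_type in QUOTED:
        return f"'{value}'"
    if sap_type in NUMERIC:
        return str(value)
    raise ValueError(f"unknown SAP data type: {sap_type}")
```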


SAP R/3 to DataStage Data Type Conversion


DataStage automatically converts SAP R/3 data types to DataStage data types. The
following table shows how the data types are converted:

SAP R/3 Data Type DataStage Data Type


CHAR(n) CHAR(n)
CLNT(n) CHAR(n)
CUKY(n) CHAR(n)
CURR(n, m) DECIMAL(n, m)
DATS CHAR(8)
DEC(n) DECIMAL(n+1, 0)
FLTP(n) FLOAT(n)
INT1(n) CHAR(n)
INT2(n) CHAR(n)
INT4(n) CHAR(n)
LANG(n) CHAR(n)
LCHR(n) VARCHAR(n)
LRAW(n) VARCHAR(n)
NUMC(n) CHAR(n)
QUAN(n, m) DECIMAL(n, m)
RAW(n) VARCHAR(n)
TIMS CHAR(6)
VARC(n) VARCHAR(n)
UNIT(n) CHAR(n)

Note: DataStage cannot handle an LRAW field that is a cluster in an SAP R/3
database.
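The conversion table above can be expressed as a small lookup function (an illustrative sketch, not part of the product; the lengths n and m are carried through exactly as the table shows):

```python
def to_datastage_type(sap_type, n=None, m=None):
    """Map an SAP R/3 data type to its DataStage data type
    per the conversion table above."""
    if sap_type in ("CHAR", "CLNT", "CUKY", "INT1", "INT2", "INT4",
                    "LANG", "NUMC", "UNIT"):
        return f"CHAR({n})"
    if sap_type in ("CURR", "QUAN"):
        return f"DECIMAL({n}, {m})"
    if sap_type == "DEC":
        return f"DECIMAL({n + 1}, 0)"
    if sap_type == "FLTP":
        return f"FLOAT({n})"
    if sap_type in ("LCHR", "LRAW", "RAW", "VARC"):
        return f"VARCHAR({n})"
    if sap_type == "DATS":
        return "CHAR(8)"
    if sap_type == "TIMS":
        return "CHAR(6)"
    raise ValueError(f"unsupported SAP data type: {sap_type}")
```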

Validating Run-time Jobs


Due to the complexity of the stage properties, you validate them as they will be
used when the job runs. Use Validate Stage… (formerly Job Validation in the
previous version of the plug-in) on the General tab of the Output page to do
this. Validate the stage before you save the stage design (see “The Output
General Page” on page 2-9).


Validation is based on the Data Transfer Method (FTP, CPI-C, or Local File)
that you specify on the Data Transfer Method tab of the Output page. (The
system automatically supplies this data transfer method information to the
Runtime Validations dialog.)
Stage validation also provides a consolidated view of the stage options and
displays the various validation steps. It uses the following graphical status lights
to do this:
• Green - success
• Orange - warning
• Red - failure
In addition, system error and status messages appear beside the affected items in
the form of text tips.

FTP Transfers
This example of the Runtime Validations dialog displays successful validations
for FTP data transfers:


For FTP transfers, the following operations are considered:


• Data Transfer Method. The FTP Logon is validated using the FTP logon
details specified in the FTP logon section on the Data Transfer Method tab
of the Output page. The value specified in Application Server on the
General tab of the Output page is used as the FTP server. Path of Remote
File is validated to check if it has a value. If no value is specified in Abso-
lute Remote Path and Path of Remote File, an error occurs.
• SAP Connection/Logon. SAP connection and logon information is vali-
dated based on the method selected to load and run the ABAP program.
For Design time or Manual Load and Manual Run, SAP logon informa-
tion is not required and is not checked. However, for a Runtime Load or
when DataStage runs the ABAP program, the SAP connection and logon
are validated using the values specified on the General tab of the Output
page.
• Check installed RFC. As for the CPI-C transfer type, the validation routine
checks whether the correct version of RFC (Z_RFC_DS_SERVICE) is
installed and activated on the SAP R/3 system.
• ABAP Options. Again, based on the method selected to load and run the
ABAP program, this section requires different validation sequences. For
Design time Load or Manual Load and Manual Run, the validation
routine only checks whether the ABAP program exists in the DataStage
Repository. In the case of Runtime Load or when DataStage runs the ABAP
program, the validation routine also checks the syntax of the ABAP
program.

CPI-C Transfers
You can also perform validations for CPI-C data transfers. For these transfers, the
following operations are considered:
• SAP Connection/Logon. SAP connection and logon are validated using the
details specified on the General tab of the Output page. However, if you
specify CPI-C User Name and Password in the CPI-C logon details section,
these values are used instead. After a successful logon, the system also
determines whether the user type is CPI-C and indicates this with a light.
• Check installed RFC. This component is shipped with the Ascential PACK
for SAP R/3 and must be installed on the SAP R/3 system. The validation
routine checks whether the correct version of RFC (Z_RFC_DS_SERVICE)
is installed and activated on the SAP R/3 system.


• ABAP Options. Based on the method selected to load the ABAP program,
this section requires one of the following validation sequences:
– Design time Load and Manual Load. The validation routine only checks
whether the ABAP program exists in the DataStage Repository.
– Runtime Load. The validation routine also checks whether the ABAP
program is syntactically correct.

Local File Transfers


Local File transfers require no validations.


3
The IDoc Extract Plug-In

Introduction
The SAP R/3 suite of applications supports ERP (Enterprise Resource Planning),
integrating the supply-chain processes of a company. An IDoc (intermediate
document) is a report, that is, a hierarchical package of related records,
generated by SAP R/3 in an SAP proprietary format. An IDoc, whose transmission
is initiated by the source database, exchanges data between applications and
provides a standard format for exchanging data with SAP R/3. Individual IDocs
contain the data that make up a business transaction (for example, a sales order)
or master data (for example, material master) and include segment records and a
control record.
Part 2 of this technical bulletin describes the DataStage IDoc Extract for SAP R/3
plug-in stage, which lets DataStage capture IDocs from R/3 source systems to be
used as source data for DataStage job data streams. It lets you browse SAP R/3
IDoc meta data, select IDoc types to process, and extract the data from IDocs.
This part of the technical bulletin describes the following for the IDoc Extract for
SAP R/3 plug-in, which works with DataStage 6.0 or later:
• Functionality
• Using DataStage to process SAP IDocs
• Configuration requirements
• Runtime components
• IDoc Extract for SAP R/3 plug-in stage
• Connection to SAP
• Output link definitions
• DataStage Administrator for SAP
• Properties
• File permissions
The IDoc Extract plug-in includes a set of tools and a custom GUI to passively
retrieve and process IDocs generated by SAP R/3. It complements the DataStage
ABAP Extract for SAP R/3, which is described in part 1 of this technical bulletin.
You can use only standard SAP interfaces to access IDoc meta data and content. No
ABAP code is uploaded to R/3 source systems for IDocs.
NLS (National Language Support) is supported for the IDoc Extract plug-in.
For information about using plug-ins, see the DataStage documentation. See
“Terminology” on page 1-7 for a list of the terms used in this chapter.

Functionality
The IDoc Extract plug-in has the following functionality:
• Retrieval of IDocs generated by SAP R/3 as source data for DataStage job
data streams.
• Simultaneous connections to multiple SAP R/3 instances from DataStage.
• Ability to define a unique directory for each IDoc type and R/3 source
combination.
• IDoc meta data browser capability.
• Automatic and manual job processing modes.
• Automatic re-connections to SAP R/3.
• A separate client utility, the DataStage Administrator for SAP, which
manages and configures R/3 connection and IDoc type properties.
• A persistent staging area (PSA) on the DataStage server for storage of IDocs
retrieved from R/3 source systems.
• Performance features for a high volume of data.
• A mechanism for coordinating the processing of IDocs from the PSA.

Using DataStage to Process SAP IDocs


The following steps describe how to process SAP IDocs using DataStage:
1. Use DataStage Administrator for SAP to define a connection to the SAP
database.
2. Use the SAP GUI to define a tRFC port and a logical system to represent
DataStage on the SAP system.
3. Create DataStage jobs to process IDocs of particular IDoc types.
4. Configure the SAP system to send IDocs of these types to DataStage.


The source data is delivered to DataStage rather than being actively requested
from the source as in the traditional data access plug-ins. Consequently, a
background listener receives the IDocs as an R/3 source system delivers them to
DataStage. The listener stores each received IDoc in a persistent staging area (PSA),
which conceptually serves as the data source for the IDoc Extract plug-in. This
plug-in is a source stage, which means that it has no input links. It reads the IDoc
from the PSA, parses its content, and forwards this content as relational rows down
the output links of the stage for processing by downstream stages.
The custom GUI, which interfaces directly with an R/3 system, helps you design
jobs that use the IDoc Extract plug-in. You can select from a list of available IDoc
types and choose the IDoc segments you want. IDoc segment meta data is
automatically retrieved and used to populate a column grid for each output link
associated with a segment.
The following diagram illustrates the architecture of the system. The runtime
components include the listener sub-system and the IDoc Extract plug-in. These
two components provide the IDoc retrieval and processing functions for the
system, as described in the following sections.

Configuration Requirements
DataStage Systems. You need to install the following components: two on the
DataStage server system and two on the client.
• DataStage server system:
– Listener sub-system, which installs and registers the listener manager as
a service executable. (See “Listener Sub-System” on page 3-5 for more
information.)
– IDoc Extract plug-in.
• DataStage client system:
– IDoc Extract client GUI.
– DataStage Administrator for SAP utility.


As with the ABAP Extract plug-in, the installation is facilitated by a prior
installation of the SAP GUI.
R/3 Source Systems. Some configuration on all R/3 source systems is required to
identify DataStage as a target system. Otherwise, the listeners cannot connect to
RFC ports in order to receive IDocs.
To acknowledge receipt of IDocs from R/3, DataStage must send a message of
type STATUS back to the sending R/3 system. The logical system corresponding
to the DataStage RFC server can participate in various distribution model views.
In that case, these distribution model views must be configured to allow STATUS
messages that use an IDoc type of SYSTAT01 to be sent back to the sending R/3
system. Additionally, the partner profile should be configured to collect status
messages to avoid potential locking issues.
Install the IDoc Extract plug-in from the DataStage CD-ROMs as described in
Chapter 1 “Installation.”

Note: You must use a username and a non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.

Runtime Components
The IDoc Extract includes the listener sub-system and the DataStage plug-in
components. The following sections describe the listener sub-system.
For information about the plug-in stage and the administrative functions, see
“IDoc Extract for SAP R/3 Plug-in Stage” on page 3-15 and “The Administrator
for SAP Utility” on page 6-1, respectively.

Listener Sub-System
The listener sub-system includes the following components:
• Listener manager
• One or more RFC listener servers
This architecture is similar to that of the BW Load PACK (see DataStage Load
PACK for SAP BW Plug-In, 74-0126). It lets multiple R/3 sources deliver IDocs to
DataStage. It runs as a daemon on UNIX and as a service on Windows NT.


Listener Manager
The listener manager detects changes in RFC servers by doing the following:
1. At startup, the manager reads a configuration file that contains parameters
describing R/3 connections, which are defined by the DataStage job designer.
2. For each configured R/3 connection, the listener manager starts an RFC
listener server in a separate background process. The manager ensures
that all its associated listener servers remain active and connected to R/3.
3. If it finds any irregularities, it tries to restart the server in question.
4. Additionally, if connection parameters change while a server is
connected, the manager stops the server and uses the updated connection
parameters to restart it.
5. The listener sub-system writes any error messages it encounters, as well
as any messages reported by R/3, to a log file (one for each listener). You
can view the contents of the log file manually, or use the DataStage
Administrator for SAP utility. (See “The Administrator for SAP Utility”
on page 6-1.)
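A single supervision pass of this loop can be sketched as follows. This is an illustrative Python model only; the real manager is a compiled service, and the function and field names (`start_listener`, `alive`, `params`) are assumptions used for the sketch:

```python
def reconcile(wanted, running, start_listener):
    """One supervision pass: start missing listener servers, restart dead
    ones, and restart servers whose connection parameters changed."""
    for name, params in wanted.items():
        server = running.get(name)
        if server is None or not server["alive"]:
            # Steps 2-3: no server yet for this connection, or an
            # irregularity was found, so (re)start it
            running[name] = start_listener(name, params)
        elif server["params"] != params:
            # Step 4: connection parameters changed while connected,
            # so stop the server and restart it with the new parameters
            running[name] = start_listener(name, params)
    return running

# Usage sketch: one configured connection, nothing running yet.
start = lambda name, params: {"alive": True, "params": params}
servers = reconcile({"P45VERSION3": {"progid": "idoctest3"}}, {}, start)
```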

Listener Server
Listener servers run in the background, listening on R/3 transactional RFC (tRFC)
ports, waiting for IDocs to be delivered by R/3. An R/3 administrator must
configure tRFC ports, identifying DataStage as a target system. When a server
receives IDocs, it saves them as flat files to a PSA on disk and sends receipt
confirmation to R/3.
The stage runtime component parses the IDoc content and forwards the content as
relational rows down the output links for the stage for processing by downstream
stages.

Starting and Stopping RFC Listener Manager on Windows NT


The DataStage RFC Listener Manager is a Windows NT service that starts up
automatically by default when the operating system is started.
If you need to stop or start the RFC Listener Manager on Windows NT, we
recommend that you use the Windows NT Service Manager dialog as follows:
1. Choose Start ➤ Settings ➤ Control Panel to display the Windows NT
Control Panel.


2. Double-click Services to display the Services dialog box.


3. Select the Ascential DataStage (IDoc Manager) service.
4. Click the start/stop button to start or stop the RFC Listener Manager.
5. Click Close to exit the Services dialog box.
If you shut down the RFC Listener Manager, it shuts down all individual RFC
Listener Servers.
For details about using the Windows NT Services dialog, see the Windows NT
documentation and online help.

Starting and Stopping the RFC Listener Manager On UNIX


The DataStage RFC Listener Manager is a UNIX daemon that starts up
automatically by default when the operating system is started.
If you need to stop or start the RFC Listener Manager on UNIX, we recommend
that you run the dsidocd.rc script for your platform with the stop or start command
options. The exact script name varies by platform. The following table lists the
names of the scripts for the various platforms:

Platform Script
Solaris /etc/rc2.d/S99dsidocd.rc
HP-UX /sbin/rc2.d/dsidocd.rc
AIX /etc/dsidocd.rc
Linux /etc/rc2.d/S999dsidocd.rc
Compaq Tru64 UNIX /sbin/rc2.d/S99dsidocd.rc

Example. AIX platforms:


To start the DataStage RFC Listener Manager:
/etc/dsidocd.rc start
To stop the DataStage RFC Listener Manager:
/etc/dsidocd.rc stop
If you shut down the RFC Listener Manager, it shuts down all individual RFC
Listener Servers.
The following sections describe the execution modes available on a per-IDoc basis.


Job Execution Mode


You can use automatic or manual modes for job execution on a per-IDoc basis.
• Automatic. The server starts DataStage jobs when the number of arrived
IDocs of a specific type reaches a threshold (see the following section,
“Batch Processing Mode”).
• Manual. DataStage receives IDocs and sends them to a PSA without
processing. The DataStage operator schedules and runs the DataStage jobs
that process the specified IDoc.
DataStage job execution mode is a property of the DataStage configuration for an
IDoc type. This implies that all jobs associated with a particular IDoc type and PSA
run in automatic or manual mode.

Batch Processing Mode


The server can launch a DataStage job after a listener receives a specified number
of IDocs of a specified type. For example, you can configure the listener sub-system
to launch DataStage jobs that process IDoc type MATMAS01 after 100 instances of
this IDoc are delivered.
An instance of an IDoc type is sometimes called a message. Even if IDoc messages
of a particular type are received from multiple tRFC ports by multiple listener
servers, only the server that received the IDoc completing the batch launches a job.
It resets the batch count so subsequent arrivals of the IDoc do not trigger a job.
As in job execution mode, batch processing is a property of an IDoc type, not a
DataStage job. The batch count is used only to determine when a DataStage job is
launched. New IDocs can arrive after the job starts. Therefore, the job can process
more IDocs than the batch count for its type specifies. The default batch count is
one. You should adjust this accordingly based on the expected volume of an IDoc
and the desired frequency of DataStage job runs.
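The threshold-and-reset behavior described above can be sketched as follows. This is an illustrative Python model, not product code; a real listener would also queue launch requests while a job instance is still running, as the Note below explains:

```python
class BatchTrigger:
    def __init__(self, thresholds):
        self.thresholds = thresholds   # IDoc type -> batch count (default 1)
        self.received = {}             # IDoc type -> IDocs since last launch

    def on_idoc(self, idoc_type):
        """Record one arriving IDoc; return True when a job should launch."""
        count = self.received.get(idoc_type, 0) + 1
        if count >= self.thresholds.get(idoc_type, 1):
            # Reset the batch count so subsequent arrivals start a new batch
            self.received[idoc_type] = 0
            return True
        self.received[idoc_type] = count
        return False

# With a batch count of 100, only the 100th MATMAS01 triggers a launch.
trigger = BatchTrigger({"MATMAS01": 100})
launches = sum(trigger.on_idoc("MATMAS01") for _ in range(100))  # launches == 1
```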

Note: A DataStage limitation prevents multiple instances of the same job from
running simultaneously. Therefore, the listener has to queue job requests or
combine batches if the threshold for launching a job is reached before the
current job instance is complete. You may need to adjust the batch count to
an appropriate value to prevent this situation from occurring.

Persistent Staging Area


Because IDocs are not immediately processed by DataStage jobs when they are
received from R/3, they must be stored locally with respect to the DataStage
server. When you start a DataStage job manually using the DataStage Director or
start one from a listener server, the IDocs are read as text files from local storage.
Local storage refers to the file system of the computer, which conceptually serves
as the data source for the IDoc Extract plug-in. This local storage on the file system
is referred to as the Persistent Staging Area (PSA). The PSA comprises a collection
of individual directories that you configure when designing DataStage jobs or
configuring IDoc types with the DataStage Administrator for SAP utility.
The PSA is a configurable property of an IDoc type and the R/3 connection from
which it arrives.
• Type. You can define a separate directory for each IDoc type that the
DataStage external system can receive. That is, you can specify a directory
of your choice for each IDoc type that is represented by some DataStage
job.
• Connection. Since DataStage jobs must also specify an R/3 connection,
IDocs can be segregated by connection as well as type. For example, a
single DataStage instance can run production level jobs against an R/3
production environment while simultaneously serving as a development
or QA environment for the respective R/3 instances. In another scenario,
you can configure DataStage to listen to more than one production R/3
instance for the same or different IDoc types.
Example. The following diagram represents four RFC listener servers associated
with a single DataStage server instance. Each RFC listener server listens on one of
four R/3 instances: development, QA, and two production instances. Assume six
DataStage jobs are on this server.


The relationship between DataStage jobs and the IDoc types they process is as
follows:
• J1 processes IDoc type A from the R/3 development instance.
• J2 processes IDoc type B from the R/3 development and QA instance.
• J3 processes IDoc type A from the R/3 QA instance.
• J4 processes IDoc type A from the R/3 production instance P1.
• J5 processes IDoc type B from the R/3 production instance P1.
• J6 processes IDoc type C from the R/3 production instance P2.
Security. In addition to offering flexibility in determining how IDocs from various
R/3 instances are processed, this method lets you have full control over the file
permissions for the directories where the IDocs are stored. Thus, varying levels of
security can be applied based on IDoc type. However, the owner of the listener
server process must have write permissions on this directory, and the owner of the
process that executes the DataStage job must have read and write permissions.

PSA Maintenance
Because of potential limitations in available disk space, the file system may not
have the capacity to store every IDoc that DataStage receives. To minimize the
possibility of file systems becoming full due to an excess of IDocs in the PSA, the
IDoc Extract includes the DataStage Administrator for SAP utility for cleaning up
or archiving IDocs that have been processed by DataStage jobs and are no longer
needed. During installation, a separate executable is scheduled by the operating
system to run periodically (weekly, on Saturday, by default). Use the
utility to modify the default scheduling of the executable or to run the
cleanup/archive manually. A bookmark file determines when processed IDocs are
deleted from persistent storage.
You can achieve an additional level of control by using manual removal or
archival. Even though this task is performed automatically, you can request a
manual removal of processed IDocs of a particular type at any time (see “About the
IDoc Cleanup and Archiving Page” on page 6-15).
When the process runs, it scans the PSA, identifying all IDocs that have been
successfully processed by all jobs having an interest in the particular IDoc. When
these IDocs are identified, they are deleted from the file system or archived to
another location. If they are archived, it is the responsibility of the user to maintain
the archive.
Inactive jobs. Inactive jobs are those that have not recently run. If a job stops being
run, IDocs that are to be processed by the job accumulate. They are not cleaned up
because the cleanup process detects that the inactive job has not yet processed the
IDocs and may never process them. If the job never runs again, the IDocs will
accumulate indefinitely.
To resolve this issue, a job attains an inactive status with respect to the listener sub-
system when it has not been run for 40 days. Forty days is the default value, which
you or the DataStage Administrator can modify using the DataStage
Administrator for SAP. When a job becomes inactive, its unprocessed IDocs are not
saved or archived by the cleanup process. For details, see “The Administrator for
SAP Utility” on page 6-1.

Configuration Files
Because R/3 connection parameters and IDoc type properties are outside the scope
of individual DataStage jobs, they are not stored in job definition files in the
DataStage Repository. External configuration files store these parameters and
properties. We recommend that you use the IDoc Extract GUI to modify settings
for these configuration files. This section describes the various configuration files
that are used by the server to manage R/3 connection parameters and IDoc type
properties:
• DSSAPConnections.config
• IDocTypes.config
• <IDocType>.config

Note: On UNIX platforms, the dsenv file in the DataStage server directory must
contain a umask setting so that users have read and write permissions on
these configuration files. (Windows NT platforms handle this
automatically.)

DSSAPConnections.config File
The DSSAPConnections.config file stores R/3 connection parameters and is
located in the DSSAPConnections subdirectory of the DataStage server directory.
It contains all the information specific to the individual physical R/3 connections
that have been configured. R/3 connections are typically configured during job
design (see “Defining the DataStage Connection to SAP” on page 3-20). However,
you can also configure them using the DataStage Administrator for SAP. These
configuration files should never be directly edited (see “The Administrator for
SAP Utility” on page 6-1).
An example of the R/3 connections configuration file is shown below:
DSSAPCONNECTIONS=<BEGIN>
<BEGIN>
DSPASSWORD=py~sZiv.
SAPROUTERSTRING=
DEFAULTLANGUAGE=EN
DATAFILESPATH=
SAPSYSNUM=02
DEFAULTUSERNAME=p45user
DESCRIPTION=SALES logical system on ultra - P45 instance
SAPGROUP=
DSUSERNAME=dsuser1
SAPMESSERVER=
DEFAULTCLIENT=800
ALLOWSAPTORUNJOBS=TRUE
SAPAPPSERVER=R3sys1
LISTENFORIDOCS=TRUE
NAME=P45VERSION3
SAPSYSID=
REFSERVERPROGID=idoctest3
USELOADBALANCING=FALSE
DEFAULTPASSWORD=JSKM~
<END>
<END>
Fields of the DSSAPConnections.config File. Each connection is delimited by
<BEGIN>/<END> pairs. The fields within a connection block in the
DSSAPConnections.config file are defined as follows:
• NAME. A tag identifying the physical connection.
• DSPASSWORD. An encrypted password used by a listener server to
launch DataStage jobs.
• DEFAULTLANGUAGE. One of the SAP mnemonics representing the
native language of an RFC client connection. RFC client connections are
used by the IDoc custom GUI and by the listener when sending status
information back to R/3.
• DEFAULTUSERNAME. The username for connecting to R/3 as an RFC
client.
• DSUSERNAME. The username used by a listener server to launch
DataStage jobs.
• DEFAULTCLIENT. The R/3 client number used to connect to R/3 as an
R/3 client.
• SAPROUTERSTRING. The router string used to connect to R/3 as an RFC
client.

• SAPSYSNUM. The R/3 system number used to connect to R/3 as an RFC
client.
• REFSERVERPROGID. The program ID of a tRFC port used to connect to
R/3 as an RFC server. The listener servers use this when connecting to R/3
to listen for IDocs.
• DEFAULTPASSWORD. An encrypted password used to connect to R/3 as
an RFC client.
• SAPAPPSERVER. The host name of the system running the R/3 instance.
• USELOADBALANCING. The load balancing specification. If true, client
connections are made through a message server rather than an application
server.
• SAPMESSERVER. The message server host that handles the connection
request when using load balancing.
• SAPSYSID. The R/3 system name that handles the connection request
when using load balancing.
• SAPGROUP. The group name of the application servers when using load
balancing.
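A reader for this <BEGIN>/<END> block format could be sketched as follows. This is an illustrative Python parser only; as noted above, these files should never be edited directly and are managed by the product:

```python
def parse_config_blocks(text):
    """Parse KEY=VALUE entries delimited by <BEGIN>/<END> pairs, as in
    the DSSAPConnections.config example above."""
    entries, current = [], None
    for raw in text.splitlines():
        line = raw.strip()
        if line.endswith("=<BEGIN>"):
            continue                 # outer wrapper, e.g. DSSAPCONNECTIONS=<BEGIN>
        if line == "<BEGIN>":
            current = {}             # start of one connection block
        elif line == "<END>":
            if current is not None:
                entries.append(current)
                current = None       # a second <END> closes the outer wrapper
        elif current is not None and "=" in line:
            key, _, value = line.partition("=")
            current[key] = value
    return entries

sample = """DSSAPCONNECTIONS=<BEGIN>
<BEGIN>
NAME=P45VERSION3
SAPAPPSERVER=R3sys1
USELOADBALANCING=FALSE
<END>
<END>"""
connections = parse_config_blocks(sample)
# connections[0]["NAME"] → "P45VERSION3"
```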

IDocTypes.config File
When the listener manager starts, it starts a listener server for each connection
described in the DSSAPConnections.config file. The previous example contains a
single connection, P45VERSION3.
When an IDoc type is first associated with a connection during job design, the
client GUI creates a directory in the DataStage server directory called
DSSAPConnections/<ConnectionName>/IDocTypes/<IDocTypeName>.
• ConnectionName specifies one of the connections defined by the NAME=
parameter in the DSSAPConnections.config file.
• IDocTypeName is a directory that specifies the name of an IDoc type.
For each configured connection, a corresponding subdirectory
(DSSAPConnections/<ConnectionName>) is created.
Each of these subdirectories contains an IDocTypes directory that contains a
configuration file named IDocTypes.config and further subdirectories, one for each
IDoc type configured. The type is received from the respective connection.


The IDocTypes.config file specifies the directory locations for writing each type of
IDoc that has been configured for that connection. It has the same format as the
DSSAPConnections.config file, but it contains different fields. For example:
DSIDOCTYPES=<BEGIN>
<BEGIN>
USE_DEFAULT_PATH=FALSE
IDOC_FILES_PATH=/u1/IDocs/MATMAS01
NAME=MATMAS01
<END>
<BEGIN>
USE_DEFAULT_PATH=TRUE
IDOC_FILES_PATH=
NAME=CREMAS01
<END>
<END>
The fields within an IDoc type block are defined as follows:
• NAME. The name of the IDoc type.
• USE_DEFAULT_PATH. A value of TRUE indicates that DataStage defines
the default directory to serve as the PSA for this IDoc type. If TRUE, IDocs
received by the listener are stored in the corresponding IDocTypeName
directory in the IDocTypes directory.
• IDOC_FILES_PATH. If USE_DEFAULT_PATH is FALSE, specifies the
user-defined directory that serves as the PSA for this type.
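Resolving the PSA directory for an IDoc type from these two fields might look like this. This is a Python sketch; the default-path layout follows the directory structure described above, and the server-directory argument is an assumption:

```python
import os

def psa_directory(entry, server_dir, connection_name):
    """Directory where the listener stores IDocs of this type."""
    if entry.get("USE_DEFAULT_PATH") == "TRUE":
        # Default: the IDocTypeName directory under the connection's
        # IDocTypes directory in the DataStage server directory
        return os.path.join(server_dir, "DSSAPConnections",
                            connection_name, "IDocTypes", entry["NAME"])
    return entry["IDOC_FILES_PATH"]

matmas = {"NAME": "MATMAS01", "USE_DEFAULT_PATH": "FALSE",
          "IDOC_FILES_PATH": "/u1/IDocs/MATMAS01"}
# psa_directory(matmas, "/opt/dstage", "P45VERSION3") → "/u1/IDocs/MATMAS01"
```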

<IDocType>.config File
The <IDocType>.config file exists in every IDocTypeName directory that is
configured as a destination for a particular IDoc type, either by default or
because you explicitly specified it.
IDocType is the type name of the IDoc. This configuration file contains all the
information that is specific to that IDoc directory location. For example:
IDOC_COUNT=100
DSUSERNAME=johnharvey
DSPASSWORD=py~sZiv.
AUTORUN_ENABLED=TRUE
ARCHIVE_IDOC_FILES=FALSE
USE_DEFAULT_LOGON=FALSE


The fields in the <IDocType>.config file are defined as follows:


• IDOC_COUNT. The number of IDocs of this type to collect in the PSA
before a listener server launches a DataStage job to process them.
• DSUSERNAME. The DataStage user name used by the listener server to
connect to DataStage in order to launch a job. If USE_DEFAULT_LOGON is
TRUE, DSUSERNAME is ignored.
• DSPASSWORD. The encrypted DataStage password used with
DSUSERNAME. If USE_DEFAULT_LOGON is TRUE, DSPASSWORD is ignored.
• AUTORUN_ENABLED. If set to TRUE, DataStage jobs associated with the
IDoc type are launched automatically by the listener server when
IDOC_COUNT is reached.
If AUTORUN_ENABLED is set to FALSE, DataStage jobs are not started by
the listener server, and IDOC_COUNT is ignored.
• ARCHIVE_IDOC_FILES. If set to TRUE, the cleanup process moves the
IDocs to a default location rather than deleting them from the PSA.
• USE_DEFAULT_LOGON. If this field is set to TRUE, the listener server
uses the DSUSERNAME/DSPASSWORD combination in the
DSSAPConnections.config file that is configured for the connection. If it is
set to FALSE, it uses the DSUSERNAME and DSPASSWORD combination
in this <IDocType>.config file.
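The USE_DEFAULT_LOGON rule can be expressed as a small sketch. This is illustrative Python only; the field names are taken from the two configuration files above, and the sample values echo the examples shown earlier:

```python
def listener_logon(connection_cfg, idoc_type_cfg):
    """Credentials the listener server uses to launch DataStage jobs."""
    if idoc_type_cfg.get("USE_DEFAULT_LOGON") == "TRUE":
        # Fall back to the connection-level logon in DSSAPConnections.config
        return connection_cfg["DSUSERNAME"], connection_cfg["DSPASSWORD"]
    # Otherwise the per-IDoc-type logon in <IDocType>.config applies
    return idoc_type_cfg["DSUSERNAME"], idoc_type_cfg["DSPASSWORD"]

conn = {"DSUSERNAME": "dsuser1", "DSPASSWORD": "py~sZiv."}
idoc = {"USE_DEFAULT_LOGON": "FALSE",
        "DSUSERNAME": "johnharvey", "DSPASSWORD": "py~sZiv."}
# listener_logon(conn, idoc) → ("johnharvey", "py~sZiv.")
```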
For more information about the management of these configuration files, see “The
Administrator for SAP Utility” on page 6-1.

IDoc Extract for SAP R/3 Plug-in Stage


IDocs that are received by a listener server from R/3 are saved to disk. DataStage
jobs do not directly read IDocs from an R/3 system. Instead, they are read from the
directory specified in the configuration file for the particular IDoc type as
described in “Configuration Files” on page 3-11. You can use the IDoc Extract plug-
in to start DataStage jobs automatically using a request from a listener server, or
you can use the DataStage Director or dsjob to start them.
The IDoc Extract plug-in stage is a passive stage that has output links but no input
links.


The IDoc Extract plug-in for release 5 of the Ascential PACK for SAP R/3 uses this
icon:

Each output link from the IDoc Extract stage is mapped to a single IDoc data
segment. You should create links for each IDoc segment whose data is of interest
when you design a job.
The following screen uses IDoc type MATMAS01 as an example:

If you process only segments E1MARAM and E1MAKTM, for example, the data
contained in other segments is ignored. Segments are records within an IDoc that
can be child or parent segments. They include administrative fields such as the
IDoc number, the segment number, and the number of the parent segment. Child
segments have a many-to-one relationship to parent segments.
The selection of E1MARAM, which has child segments, does not imply that all its
child segments are included. Only the segment fields of E1MARAM are processed.
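Conceptually, each output link receives only the records of its own segment type, as in this simplified sketch. This is illustrative Python; the administrative field names follow the description above but are assumptions about the parsed record layout:

```python
def rows_for_link(segment_records, segment_type):
    """Records forwarded down the output link bound to one segment type;
    records of other segment types are ignored by this link."""
    return [r for r in segment_records if r["segment"] == segment_type]

records = [
    {"segment": "E1MARAM", "idoc_number": 1, "segment_number": 1, "parent": 0},
    {"segment": "E1MAKTM", "idoc_number": 1, "segment_number": 2, "parent": 1},
    {"segment": "E1MARCM", "idoc_number": 1, "segment_number": 3, "parent": 1},
]
# A job with links only for E1MARAM and E1MAKTM never sees the E1MARCM record.
```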


The segment fields for E1MARAM correlate with the column definitions for the
output link with which E1MARAM is associated. The segment fields for E1MAKTM
become the column definitions for a second output link. Use the custom GUI to
define the column meta data definitions. When a segment is chosen for a link, the
GUI displays the available segment fields that you can select for output. The GUI
then automatically translates the R/3 meta data for the selected segment fields to
equivalent DataStage meta data definitions and populates the column grid.
Examples. The following screens illustrate the translation of IDoc segment field
E2MAKTM meta data to DataStage columns with DataStage meta data:

Note: DataStage uses the segment definition name, for example, E2MAKTM001,
rather than the segment type name. The distinction between the two is
beyond the scope of this document. For more information, consult your
SAP documentation.


This R/3 meta data is translated to DataStage meta data as:

Later sections (starting with “Defining the DataStage Connection to SAP” on
page 3-20) describe the GUI in detail.
Bookmark Files. When the job is run, the stage reads unprocessed IDocs of the
specified type from the appropriate location in the PSA. Unprocessed IDocs are
identified by a bookmark file that is maintained for each IDoc type. (See the next
section and “PSA Maintenance” on page 3-10 for details about bookmarks.)
When subsequent DataStage jobs are run, only those IDocs that arrived following
the current bookmark are processed. This prevents a job from repeatedly
processing the same set of IDocs.
If a job fails, you can use the reset functionality for the job to restore the bookmark
to its original state.
Test Mode. To prevent the bookmark for the job from being updated, you can
process IDocs in Test Mode (set on the General tab of the Stage page). For more
information, see “About the Stage General Page” on page 3-21.


Bookmarking the PSA


A bookmark file for each IDoc type identifies IDocs that have not yet been
processed by a DataStage job. A single bookmark file can independently account
for any number of DataStage jobs designed to process the particular IDoc type.
The bookmark is updated when a job begins. If a job aborts, a job reset causes the
bookmark to be reset to its state before the aborted run. Therefore, the same batch
of IDocs can be reprocessed without data loss.
This scheme lets more than one job process IDocs of the same type from the same
location in the PSA. Since the bookmark coordinates the access for each job to
IDocs of the same type in a given PSA, it is not possible for one job to exhaust the
IDocs before a second job has the opportunity to process them.
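The bookmark behavior can be modeled as follows. This is a conceptual Python sketch that assumes the bookmark stores a marker for the newest IDoc file seen and that file names sort in arrival order; the real bookmark file format is internal to the product:

```python
class Bookmark:
    """Per-IDoc-type bookmark: advanced at job start, restorable on reset."""
    def __init__(self):
        self.current = ""
        self.previous = ""

    def unprocessed(self, idoc_files):
        """IDocs that arrived after the current bookmark."""
        return sorted(f for f in idoc_files if f > self.current)

    def begin_job(self, idoc_files):
        """Advance the bookmark when a job begins, remembering the old state."""
        batch = self.unprocessed(idoc_files)
        if batch:
            self.previous, self.current = self.current, batch[-1]
        return batch

    def reset(self):
        """After an aborted run, restore the bookmark so the same batch
        of IDocs can be reprocessed without data loss."""
        self.current = self.previous

mark = Bookmark()
batch = mark.begin_job(["idoc_0001", "idoc_0002"])   # job claims both files
mark.reset()                                         # job aborted: rewind
# mark.unprocessed(["idoc_0001", "idoc_0002"]) → both files again
```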

Note: On UNIX platforms, the DataStage server must be started so that users who
want to run a job have write permission on the bookmark file. This also
applies to the saprfc.ini file, which the job maintains in the DataStage project
directory.

Creating DataStage Jobs for IDoc Types


The following steps let you create a DataStage job for an IDoc type:
1. Open a new DataStage job.
2. Add an IDoc Extract stage.
3. Create additional stages to process IDoc segment records.
4. Create links between the IDoc stage and the stages that process the
segment records.
5. Define the properties of the IDoc Extract stage.
6. Save and compile the job.
For information about using plug-ins in DataStage jobs, see DataStage Server Job
Developer’s Guide.


Defining the DataStage Connection to SAP


A summary of the interaction of the stage and the links follows. The subsequent
sections give further details.
• The stage reads multiple IDocs of a particular type arriving through a
particular connection.
• The connection to an SAP R/3 system and the IDoc type properties are
defined as stage properties.
For information about selecting an IDoc type and configuring the IDoc
processing properties, see “Selecting IDoc Types” on page 3-23 and
“Importing IDoc Type Configurations” on page 6-11, respectively.
• Each link carries segment records from a particular segment type or control
records.
• The segment type or the control record is defined as a link property. Select
segments on the output links to retrieve IDoc meta data, and translate the
segment fields into DataStage column definitions (see “Defining IDoc
Extract Output Links” on page 3-30).
When you start the plug-in GUI from the DataStage Designer to edit an IDoc
Extract stage, the General tab of the Stage page appears.


This dialog has the Stage and Output pages:


• Stage. This page displays the name of the stage you are editing. This page
contains the General and the IDoc Type tabs. The selected SAP connection
parameters are displayed on the General tab on the Stage page. Generally
the connection is already defined by default. See the following sections for
details.
Use the IDoc Type tab of the Stage page to select an IDoc type (see
“Selecting IDoc Types” on page 3-23).
• Output. You assign the control record or a segment to each output link
here. You can also modify the corresponding columns that are generated.
(See “Defining IDoc Extract Output Links” on page 3-30.)

About the Stage General Page


The parameters that are needed by DataStage to connect to an SAP database are
defined on the General tab on the Stage page.
To connect to an SAP R/3 system:
1. DataStage Connection to SAP. The DataStage connection to the SAP R/3
system that is defined on the DataStage server machine and shared by all
DataStage users connected to that machine. The fields in this area are read-
only and are obtained from the connection that you selected. For more infor-
mation, see “Selecting the DataStage Connection to SAP” on page 3-22.
a. Name. The name of the selected connection to the SAP R/3 system that
generates the IDocs to be extracted by this stage.
b. Select… . Click the button to choose a DataStage connection to the SAP
R/3 system. The selected connection provides all needed connection and
default logon details that are needed to communicate with the corre-
sponding SAP R/3 system.
c. Description. Additional information about the selected connection.
d. Application Server. The name of the host system running R/3. If the
selected connection uses load balancing, Message Server is used instead.
e. System Number. The number assigned to the SAP R/3 system used to
connect to R/3. If the selected connection uses load balancing, System ID
is used instead.
2. SAP Logon Details. The fields in this area are read-only unless you clear
the Use connection defaults box.


a. User Name. The user name that is used to connect to SAP.


b. Password. A password for the specified user name.
c. Client Number. The number of the client system used to connect to SAP.
d. Language. The language used to connect to SAP.
e. Use connection defaults. Clear this box to remove the default SAP Logon
Details settings so you can use different logon information. If selected, the
displayed logon details are obtained from the selected connection and are
disabled.
3. Enter an optional description of the purpose of the stage in the Description field.
4. Test Mode. The check box that specifies whether the job runtime compo-
nent updates the bookmark file for this stage. If you select this check box,
the job runtime does not update the bookmark file for this stage. There-
fore, each time the job is run, this stage reads all the IDocs that it read
when the job was previously run.
Use Test Mode as you develop a job and need to test it repeatedly with the
same set of input data without causing the bookmark to be updated.
Selecting Test Mode also prevents the job from being run automatically as new
IDocs of this type arrive. The listener server does not launch a job in test mode
even if its IDoc type is designated for automatic execution.
The Administrator can archive processed IDocs rather than permanently
remove them by moving them to a subdirectory of the PSA for the IDoc.
Currently, the only way to reprocess IDocs is to reset an aborted job or set the
job to Test Mode. You cannot reprocess archived IDocs. For more information
about administrative functions, see “The Administrator for SAP Utility” on
page 6-1.
For a newly created stage, the connection for the stage defaults to the last one you
selected for another stage of this type. If only one DataStage connection to an SAP
R/3 system is defined on the DataStage server machine, a new stage defaults to
that connection regardless of whether you previously created an IDoc Extract
stage.

Selecting the DataStage Connection to SAP


When you click Select on the General tab of the Stage page, the Select DataStage
Connection to SAP dialog opens. This dialog lets you choose a DataStage
connection to the SAP R/3 system. The selected connection provides all needed


connection and default logon details that are needed to communicate with the
corresponding SAP R/3 system.

Add…, Properties…, Import into…, Export…, and Remove are administrative
connection operations. They let you manage the list of connections that is
maintained on the DataStage server machine from the Select DataStage
Connection to SAP dialog. They function in the same way as the corresponding
buttons on the DataStage Connections to SAP page of the DataStage
Administrator for SAP utility.
For more information about managing the connections, see “The Administrator for
SAP Utility” on page 6-1.

Selecting IDoc Types


After you provide a DataStage connection to SAP on the General tab of the Stage
page, you select an IDoc type on the IDoc Type tab of the Stage page.


Configure an IDoc type as follows. The subsequent sections describe these steps in
detail.
1. Click Select… to open the Select IDoc Type dialog for selecting an IDoc type.
2. Select a type, and click OK.
3. A prompt may appear about an unconfigured IDoc type. Click Yes to
configure the IDoc type for use with DataStage.
4. Define the IDoc type configuration properties on the IDoc Type Proper-
ties dialog.


5. The selected IDoc type and its component segment types are now visible
on the IDoc Type tab of the Stage page.

The Select IDoc Type dialog displays all the released IDoc types defined on the
SAP R/3 system. It has the following components:
• Connection. The connection name for the R/3 system whose IDoc types
are being shown is indicated at the top of the dialog.
• Description. A description of the R/3 system.
• IDoc Types. A list of all the released IDoc types defined on the R/3 system.
• Find… . Click to open a Find IDoc Type dialog. It lets you search for IDoc
Types that contain user-specified substrings in their name or description.
• Properties… . Click to view and change the DataStage configuration for the
selected IDoc type.
• OK. When you select an IDoc type and click OK, the system checks
whether anyone on the DataStage system configured the IDoc type for the
current connection. If not, you see the following message:
This IDoc type has not been configured for use with
DataStage. Would you like to set these options now?


Click Yes to set the options to default values. The IDoc Type Properties
dialog appears (see the following section).

Defining IDoc Type Properties


When you confirm the default values after selecting an IDoc type, the IDoc Type
Properties dialog opens.

The IDoc Type Properties dialog contains various parameters set to default
values. The default settings are usually appropriate.
1. Name is the name of the selected IDoc type (read-only) with its description in
the Description field.
2. Directory containing temporary IDoc files for this IDoc type and
connection is the pathname for storing IDoc files for this IDoc type and
connection. Initially, the directory is set to a default location, and the edit
box is read-only. To enter your own directory, clear the Use default direc-
tory box.
If Use default directory is selected, the default pathname is displayed, and the
control is read-only.
3. Click Browse… to browse for an alternate directory to store IDoc files for
this IDoc type and connection. If Use default directory is cleared, this
button is enabled.


4. Use default directory controls whether DataStage stores IDoc files for this
IDoc type and connection in the default location. If cleared, Directory
containing temporary IDoc files for this IDoc type and connection and
Browse… are enabled, and you can specify an alternate directory. Otherwise,
DataStage uses the default location.
If the directory does not exist, and you save changes to the IDoc type configu-
ration, the system tries to create the directory. If it cannot, you must choose
another directory.
If the directory has already been configured for this IDoc type using a different
DataStage connection to SAP, you are warned that stages that read IDocs of
this type process IDocs arriving from all connections that share this directory
for the type.
After you acknowledge this message, the values displayed in the dialog for the
other configuration details are refreshed to match those already defined for the
type through the other connections. Any changes you make to the configura-
tion details for an IDoc type whose PSA is shared between two or more
connections affect all connections processing this type from this PSA.
5. Click Archive processed IDoc files to archive processed IDocs rather
than permanently remove them. The archive location is at the same level
as the PSA for the IDoc.

Note: It is the responsibility of the Administrator to oversee the permanent
storage of archived IDocs, for example, moving them to an alternate
disk location or removable storage media. Since you cannot restore
processed IDocs, contact Ascential Customer Support if you need to
reprocess IDocs. For details about the administrative utility, see “The
Administrator for SAP Utility” on page 6-1.

6. If Run jobs that extract IDocs of this type after receiving n IDocs is
selected, jobs are automatically run that read IDocs of this type each time
another n IDocs of this type are received by the IDoc listener server. (The
default is one.) If the number of IDocs of this type is expected to be small
and to arrive frequently, increase the number of IDocs that must arrive
before jobs are automatically run.
Alternatively, you can disable automatic job invocation for this IDoc type by
clearing the check box. In this case, use the DataStage Director to schedule jobs.
7. The DataStage Logon Details for Running the Jobs area specifies the
DataStage logon user name and password, and whether to use the
defaults for the connection.


8. R/3 Version specifies the version of the R/3 system that is set for the IDoc
type in the R/3 system itself. This lets you change segment meta data for
the IDoc types when you upgrade an R/3 system to a later version (see
“Specifying R/3 Versions” on page 3-29).
9. After you select an IDoc type and optionally define IDoc Type configura-
tion properties, the Select IDoc Type dialog closes, and you return to the
IDoc Type tab of the Stage page, with the properties of the selected IDoc
type now visible.

The IDoc Components area shows the control record and all the segments defined
for the IDoc type with their descriptions. (The control record, one for each IDoc, is
an administrative record that contains a standard set of fields describing the IDoc
as a whole.) This area contains the following information:
• Name. Shows the hierarchical relationship among the segments using a
tree structure with their descriptions.
(A segment type can appear only once within an IDoc type. The names for
the segments are the segment definition names, not the segment type
names. You can infer the segment type name from the segment definition
name.)
• Assigned Output Link. After particular segments in the IDoc Type are
assigned to the output links of the stage, the IDoc Components control
shows the names of the links in the Assigned Output Link column of the


control. This gives you an overview of which segments are being extracted
by the stage.
The General tab of the Output page shows the segment type or control record for
each output link as described in subsequent sections.
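To make the control record concrete, here is a hedged sketch in Python. The field names shown (DOCNUM, IDOCTYP, MESTYP, SNDPRN, RCVPRN) are typical of IDoc control records, but the exact field set depends on the R/3 version, and all values below are invented:

```python
# Illustrative control record: one administrative record per IDoc,
# carrying standard fields that describe the IDoc as a whole.
# All values are invented placeholders.
control_record = {
    "DOCNUM": "0000000000123456",  # IDoc number
    "IDOCTYP": "ORDERS05",         # IDoc type
    "MESTYP": "ORDERS",            # message type
    "SNDPRN": "SENDER01",          # sender partner number
    "RCVPRN": "DATASTAGE",         # receiver partner number
}

def describe(cr):
    """One-line summary built from the control record's standard fields."""
    return "IDoc {DOCNUM} of type {IDOCTYP} (message {MESTYP})".format(**cr)

print(describe(control_record))
```

Because the control record is common to every IDoc of every type, a job typically routes it down its own output link, separate from the segment links.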

Specifying R/3 Versions


When an R/3 system is upgraded to a later version, you can add new fields to the
segment definitions used by the IDoc types that are currently defined on the
system, thus changing segment meta data for the IDoc types.
To prevent third-party applications that receive IDocs from the R/3 system from
having to change to accommodate the new segment definitions, you can specify
that IDocs of a particular IDoc type that are sent through a particular port use the
segment definitions from an earlier version of the R/3 system.
If this is already done in your SAP application, set R/3 Version on the IDoc Type
Properties dialog to the same value that is set for the IDoc type in the R/3 system
itself. (See “Connection and Logon Details Page” on page 6-5.)
R/3 Version lists versions that are the same as or earlier than the version of the
R/3 system specified in the connection. It defaults to the current version of the
R/3 system.
Version 46E and Later. By default, the IDoc Extract product only knows about R/3
46D or earlier. If you have an R/3 system that is later than 46D, the program reads
the version from the R/3 system and automatically adds it to the list.
Missing Versions. Suppose the R/3 system is Version 46F. The combo box then
includes 46F but may be missing 46E. If you want to specify 46E, type the value
directly into the edit control of the combo box.
When you click OK, the system detects that the specified R/3 Version is unknown
and prompts:
R/3 version "46E" is not among the versions currently known to
DataStage. Would you like "46E" to be added to the list of known
versions?
If you click Cancel, the IDoc Type Properties dialog stays open.
If you enter an incorrectly formatted version and click OK, an error message
appears.


If you enter a version number that is later than that of the R/3 system specified in
the connection, an error message appears.

If you change the port version of the connection for the stage or the R/3 version of
the IDoc type for the stage, the stage automatically refreshes itself using
appropriate IDoc meta data. If either of these versions is changed before you
open the stage editor, a warning appears when you open it.

Defining IDoc Extract Output Links


After you select an IDoc type, you assign the control record or a segment to each
output link. Begin the selection process from the General tab on the Output page.


After you choose the output link from the Output name box, the following steps
summarize how output links function.
The subsequent sections describe these steps in detail.
1. Click Select… to open the Select IDoc Component to Extract dialog (see
“Selecting the IDoc Component to Extract” on page 3-32). The IDoc Compo-
nent to Extract, Description, and Fields in Component information is
displayed after you select a control record or a segment for the link.
2. The selected segment type and its fields now appear on the General tab
of the Output page.
The administrative fields common to all segment types appear at the begin-
ning of the list.
3. Columns are automatically generated for each link when the segment
type is selected.
4. You can delete unnecessary columns (administrative fields that are rarely
used are not visible).
5. Columns for missing fields can be added by clicking Add Columns on
the Columns tab of the Output page.


About the General Tab for the Output Page


This tab is displayed by default. The following sections describe how output links
function.

Selecting the IDoc Component to Extract


Click Select… on the Output General page to open the Select IDoc Component
to Extract dialog, which contains an IDoc Components control identical to the one
on the IDoc Type tab of the Stage page.

1. Select the control record or a segment, and click OK.


2. The dialog closes, and you return to the General tab of the Output page
with the IDoc Component to Extract, Description, and Fields in Compo-
nent information now shown (see a sample screen in the “Examples”
section, which begins on page 3-17).
If you selected a segment, the first seven fields in the Fields in Component list
are administrative fields common to all segments. These fields are followed by
the data fields specific to this segment type.


Adding Columns
When you choose a segment or the control record on the General tab on the
Output page, a corresponding list of columns is automatically generated.
You can view and modify these columns using the Columns tab on the Output
page (see a sample screen in the “Examples” section, which begins on page 3-17).
To view and modify the columns:
1. Columns corresponding to segment administration fields are given names
prefixed by ADM_. The default column list does not include columns for the
less frequently used administrative fields (namely, SEGNAM, MANDT,
HLEVEL, and SDATA). You can also delete any unneeded columns.
The Corresponding Extract Field for Column "ADM_DOCNUM" is the
single-row grid near the bottom of the tab showing the extract field that corre-
sponds to the selected column, in this case, ADM_DOCNUM (the value
changes depending on the selected column). It includes Name, Description,
Internal Type, and Length information.
2. Click Add Columns… to add columns for any fields that are not
currently represented in the columns list. The Add Columns dialog
appears.

If you double-click a field or select one or more fields, and click Add, the
fields disappear from the Add Columns dialog.
Corresponding columns appear in the Columns tab. The new columns are
inserted into positions in the column list that match the sequence of the
corresponding fields in the segment. The columns are shown as selected and
are automatically scrolled into view.


After the columns are added, the Add Columns dialog remains open so you
can optionally add columns for remaining fields.
Click Close to exit the dialog and return to the Columns tab of the Output
page. The dialog also closes automatically if there are no columns left to be
added.
3. Click Validate Columns to check whether each column has a matching
extract field and that the properties for the column are consistent with
those of the field. If the properties of the column are inconsistent, the
program offers to correct the properties of the column.
4. Click Save… to open the Save IDoc Segment Definition dialog to save
meta data definitions for a job to the DataStage Repository. (See the
following section, “Saving IDoc Meta Data Definitions”.)
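The Validate Columns check in step 3 can be pictured with the following sketch. This is hypothetical: the field table, column properties, and the exact rules the plug-in applies are invented, and only the ADM_ naming convention comes from the text above:

```python
# Hedged sketch of a Validate Columns style consistency check: every
# column must have a matching extract field, and the column's properties
# must agree with that field's. Names, types, and lengths are invented.
fields = {"DOCNUM": ("Character", 16), "SEGNAM": ("Character", 30)}

columns = [
    {"name": "ADM_DOCNUM", "type": "Character", "length": 16},
    {"name": "ADM_SEGNAM", "type": "Character", "length": 10},  # mismatch
]

def validate_columns(columns, fields):
    problems = []
    for col in columns:
        name = col["name"]
        # Strip the ADM_ prefix to recover the underlying field name.
        field_name = name[4:] if name.startswith("ADM_") else name
        field = fields.get(field_name)
        if field is None:
            problems.append((name, "no matching extract field"))
        elif (col["type"], col["length"]) != field:
            problems.append((name, "properties differ from the field"))
    return problems

print(validate_columns(columns, fields))
```

In the real dialog, a detected inconsistency results in an offer to correct the column's properties rather than a returned list.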

Saving IDoc Meta Data Definitions


You can save meta data definitions for a job to the DataStage Repository. Besides
being useful to other jobs that may process the target data of the IDoc Extract, the
stored meta data lets the DataStage Manager perform where-used analysis on IDoc
meta data. It also lets MetaStage analyze the meta data and perform various
queries against it.
Default values are provided when you click Save… on the Columns tab of the
Output page.

The Save IDoc Segment Definition dialog contains the following fields:
• Data source type. The type is always IDoc.


• IDoc type. The name of the data source.


• Segment name. The name of the table or file.
• Short description. The brief description of the IDoc segment.
• Long description. The detailed description of the IDoc segment.


4
The IDoc Load Plug-In

Introduction
The SAP R/3 suite of applications supports ERP (Enterprise Resource Planning),
integrating the supply-chain processes of a company. An IDoc (intermediate
document) is a hierarchical package of related records generated by SAP R/3 in
an SAP proprietary format. An IDoc, whose transmission is initiated by the
source database, exchanges data between applications. It provides
a standard format for exchanging data with SAP R/3. Individual IDocs contain the
data that make up a business transaction (for example, a sales order) or master data
(for example, material master) and include segment records.
This part of the technical bulletin describes the DataStage Load for SAP R/3 plug-
in, which is a passive stage that has input links but no output links. Use this plug-
in to generate IDocs from source stages to load data into SAP R/3.
This technical bulletin describes the following for Version 1.0 of the DataStage
Load for SAP R/3 plug-in, which works with DataStage Release 5.1 or later:
• Functionality
• Configuration requirements
• Defining the IDoc Load for SAP R/3 stage
The user interface for the DataStage IDoc Load for SAP R/3 stage is almost
identical to that of the DataStage IDoc Extract for SAP R/3 stage.
NLS (National Language Support) is supported for the DataStage Load for SAP
R/3 plug-in.
See ”Terminology” on page 1-7 for a list of the terms used in this chapter.


For More Information. For information about using plug-ins or SAP, see the
following table:

If you want information on… Then see…


Using NLS DataStage Administrator Guide
Using a plug-in in a DataStage job DataStage Server Job Developer’s Guide
Using SAP SAP documentation

The IDoc Load for SAP R/3 plug-in has the following functionality:
• Ability to browse and select IDoc types from a meta data browser in the
GUI.
• Selection of IDoc segment types to which non-SAP data is loaded. The data
for each segment is read in on a separate link in parallel processes.
• Ability to allow the DataStage job designer to map relevant segment data
from multiple sources. Each IDoc Load stage in a DataStage job can send
IDocs of one chosen type to SAP R/3.
• A mapping mechanism that allows relational row data to be converted to
SAP’s proprietary IDoc data format. IDoc structural meta data determines
the link processing order. Join keys, which are sort keys, are identified by
the job designer. The processing order and join keys are used by the stage
to assemble IDocs.
• Support for message types, the scheme for electronically transmitted data
used for one specific business transaction.
• Support for primary key (P-key), and foreign key (F-key) handling, which
differs from that in the IDoc Extract for SAP R/3 plug-in. Foreign keys for
segments at root level of the IDoc hierarchy separate data into separate
IDocs, allowing one stage to send several IDocs in one transmission to SAP
R/3. Foreign keys in non-root level segments relate child segment records
to their parent segment record.
• NLS (National Language Support). For information, see DataStage Adminis-
trator Guide.
Automatic job execution is unsupported for the Parallel Canvas.


Configuration Requirements
DataStage Systems. You need to install the following components: two on the
DataStage server system and two on the client.
• DataStage server system:
– IDoc Load for SAP R/3 plug-in.
• DataStage client system:
– IDoc Load for SAP R/3 client GUI.
– Administrator for SAP utility.
As with the ABAP Extract for SAP R/3, the installation is facilitated by a prior
installation of the SAP GUI.
R/3 Source Systems. Some configuration on all R/3 source systems is required to
identify DataStage as a target system.
Install the DataStage Load for SAP R/3 plug-in from the DataStage CD-ROMs as
described in “Installation” starting on page 1-1.
After installing the plug-in GUI, start the plug-in editor from the DataStage
Designer by doing one of the following:
• Double-clicking the stage in the Diagram window
• Selecting the stage and choosing Properties from the shortcut menu
• Selecting the stage and choosing Edit ➤ Properties from the DataStage
Designer window

Note: You must use a username and non-blank password for DataStage logon
credentials to use the IDoc plug-ins. This means you must clear Omit on
the Attach to Project dialog in DataStage; otherwise, unexpected results
occur.

IDoc Load for SAP R/3 Stage


The user interface for the IDoc Load for SAP R/3 stage is nearly identical to that of
the IDoc Extract for SAP R/3 stage, with input links instead of output links.


The IDoc Load plug-in for release 5 of the Ascential PACK for SAP R/3 has its
own stage icon.

Each input link loads records for a particular segment type within the IDoc type
that is selected for the stage. A column list for each link is generated automatically,
based on the fields of the selected segment type. In addition, special columns are
generated to represent primary and foreign key values for each link. These key
values allow the runtime to determine which child segment records flowing into
one link belong to each parent segment record that flows into a different link.
For a root-level segment type, the foreign key value identifies the specific
generated IDoc into which the segment records will be incorporated.
Since the column lists for an IDoc Load for SAP R/3 stage are generated
automatically, Transformer stages map values from source data columns to the
columns generated from the fields of the segment types. In each Transformer stage,
you must also map source data values to values that can be used as primary and
foreign keys for the segments. Values for the key columns are not actually loaded
as data into the IDocs; they are used only to correlate records flowing into
separate links.
You can design jobs in any way that provides effective key values. The foreign key
values provided for a link representing a child segment type must exactly match
the primary key values provided for the link that represents its parent segment
type. For root-level segment types, the foreign key identifies the IDoc for the
segments.
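The correlation described above can be sketched as follows. This is not the PACK runtime, just an illustration under invented names: header rows stand for a root-level segment whose foreign key splits records into separate IDocs, and item rows stand for a child segment whose foreign key names the parent row's primary key:

```python
from collections import defaultdict

# Rows arriving on the link for a root-level (header) segment type.
# At root level, FKEY identifies which generated IDoc the record joins.
header_rows = [
    {"PKEY": "H1", "FKEY": "IDOC-1", "ORDERID": "4711"},
    {"PKEY": "H2", "FKEY": "IDOC-2", "ORDERID": "4712"},
]

# Rows arriving on the link for a child (item) segment type.
# Here FKEY names the PKEY of the parent header record.
item_rows = [
    {"PKEY": "I1", "FKEY": "H1", "MATERIAL": "M-01"},
    {"PKEY": "I2", "FKEY": "H1", "MATERIAL": "M-02"},
    {"PKEY": "I3", "FKEY": "H2", "MATERIAL": "M-03"},
]

def assemble_idocs(header_rows, item_rows):
    """Group child rows under their parents, then split by root FKEY."""
    children = defaultdict(list)
    for row in item_rows:
        children[row["FKEY"]].append(row)
    idocs = defaultdict(list)
    for hdr in header_rows:
        idocs[hdr["FKEY"]].append(
            {"header": hdr, "items": children.get(hdr["PKEY"], [])}
        )
    return dict(idocs)

idocs = assemble_idocs(header_rows, item_rows)
```

A Transformer stage would supply such key values from the source data; the point is only that root-level foreign keys partition records into IDocs, while non-root foreign keys attach children to their parents.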

Selecting the DataStage Connection to SAP


The GUI for the stage closely resembles that of the IDoc Extract for SAP R/3 stage.


When you start the plug-in GUI from the DataStage Designer to edit an IDoc Load
for SAP R/3 stage, the General tab of the Stage page appears by default.

This dialog box has the Stage and Input pages:


• Stage. This page displays the name of the stage you are editing and
contains the General and IDoc Type pages. The selected SAP connection
parameters are displayed on the General tab on the Stage page. Generally
the connection is already defined by default. See the following sections for
details.
Use the IDoc Type tab of the Stage page to select a type of the IDocs to load
(see “Selecting IDoc Types” on page 4-10).
The NLS page defines a character set map to use with the stage. This page
appears only if you have installed NLS for DataStage (see “Defining
Character Set Mapping” on page 4-13).
• Input. This page specifies the IDoc components you assign to each input
link (see “Defining IDoc Load Input Links” on page 4-13).
The main phases in defining an IDoc Load for SAP R/3 stage from the Stage dialog
box are as follows. See the following sections for details.
1. Select a connection to SAP.
2. Select an IDoc type.
3. Optionally define a character set map.


4. Define the data on the input links.


Click OK to close this dialog box. Changes are saved when you save the job design.

About the Stage General Page


The parameters that are needed by DataStage to connect to an SAP database are
defined on the General tab on the Stage page. You select a logical DataStage
connection to SAP from the list of available connections. This list is maintained on
the DataStage server. Each connection contains all the information needed to
connect to a particular R/3 system. Logical connections are shared with the IDoc
Extract for SAP R/3 component.
Use the following components on this tab to connect to an SAP R/3 system:
• DataStage Connection to SAP. The DataStage connection to the SAP R/3
system that is defined on the DataStage server machine and shared by all
DataStage users connected to that machine. The fields in this area are read-
only and are obtained from the connection that you selected. For more
information, see “Selecting the DataStage Connection to SAP” on page 8.
– Name. The name of the selected connection to the SAP R/3 system that
generates the IDocs to be loaded by this stage. The logical connection
includes RFC client logon details.
– Select… . Click to choose a DataStage connection to an existing SAP R/3
system or create a new one. The Select DataStage Connection to SAP
dialog appears (see “About the Select DataStage Connection to SAP
Dialog” beginning on page 4-7). The selected connection provides all
needed connection and default logon details that are needed to commu-
nicate with the corresponding SAP R/3 system. The default SAP Logon
Details are used if Use connection defaults is selected.
– Description. Additional information about the selected connection.
– Application Server. The name of the host system running R/3. If the
selected connection uses load balancing, Message Server is used instead.
– System Number. The number assigned to the SAP R/3 system used to
connect to R/3. If the selected connection uses load balancing, System ID
is used instead.
• SAP Logon Details. The fields in this area are read-only unless you clear
the Use connection defaults box.
– User Name. The user name that is used to connect to SAP.


– Password. A password for the specified user name.


– Client Number. The number of the client system used to connect to SAP.
– Language. The language used to connect to SAP.
– Use connection defaults. Clear to remove the default SAP Logon Details
settings so you can use different logon information for only this stage in
this job. If selected, the displayed logon details are obtained from the
selected connection and are disabled.
• Enter an optional description of the purpose of the stage in the Description field.
• Validate All. Select to check the stage and link properties and the column
lists for consistency and completeness.
For a newly created stage, the connection for the stage defaults to the last one you
selected for another stage of this type. If only one DataStage connection to an SAP
R/3 system is defined on the DataStage server machine, a new stage defaults to
that connection regardless of whether you previously created an IDoc Load for
SAP R/3 stage.

About the Select DataStage Connection to SAP Dialog


When you click Select on the General tab of the Stage page, the Select DataStage
Connection to SAP dialog opens.
This dialog lets you choose a DataStage connection to the SAP R/3 system. The
selected connection provides all connection and default logon details needed to
communicate with the corresponding SAP R/3 system.


New…, Properties…, and Remove are administrative connection operations.
They let you manage the list of connections that is maintained on the DataStage
server machine from the Select DataStage Connection to SAP dialog:
• New… . Click to open the Connection Properties dialog, which lets you
define the properties for the new connection. Here you can specify SAP
connection details and default logon details for the new connection. The
Connection and Logon Details page appears by default.

• Properties… . Click to open the Connection Properties dialog, showing the
properties of the selected connection. This is the same dialog that is opened
when you click New…, but in this context the connection name is read-only.
• Remove. Click to delete the selected connection after your confirmation.
These administrative operations function similarly to the corresponding buttons
on the DataStage Connections to SAP page of the DataStage Administrator for
SAP utility (see ”The Administrator for SAP Utility” beginning on page 6-1).
About the Connection and Logon Details Page
Use Properties… to open the Connection and Logon Details page of the
Connection Properties dialog, showing the properties of the selected connection.
This is the same dialog that is opened when you click New…, but in this context
the connection name is read-only.


About the IDoc Load Settings Page


Click IDoc Load Settings on the Connection Properties dialog to complete the
configuration for loading IDocs.

The IDoc Load Settings page contains the following components:


• Partner Number to Use when Sending IDocs. This is the logical name
which represents DataStage as the source of IDocs. Specify the case-sensitive
name that corresponds to the Partner number you created to represent
DataStage in the Partner profile in SAP.
• Destination Partner Number to Use when Sending IDocs. This is the
logical name which represents the SAP system as the target where
DataStage sends IDocs. Specify the case-sensitive name that corresponds to
the Partner number in the SAP Partner profile.
• Location for Temporary Segment Index Files. The directory where tempo-
rary segment index files are stored, for example, D:\Ascential\DataStage\
DSSAPConnections\connection name, where connection name is the name of
the DataStage connection to SAP on the Select DataStage Connection to
SAP dialog.
• Use default. Selected by default. If cleared, Browse… is enabled.
• Browse… lets you find another location.


Selecting IDoc Types


After you provide a DataStage connection to SAP on the General tab of the Stage
page, you specify the type of the IDocs to be loaded on the IDoc Type tab of the
Stage page.

To configure an IDoc type:


1. Click Select… beside the IDoc Type field on the IDoc Type tab of the Stage
page to open the Select IDoc Type dialog for selecting an IDoc type.


All available, released IDoc types, both basic and extended types, that are
defined on the R/3 system are visible, for example:

Connection displays the connection name with its description for the R/3
system whose IDoc types are being shown.
IDoc Types lists all released IDoc types defined on the R/3 system.
2. Click Find… on the Select IDoc Type dialog to open the Find IDoc Type
dialog. It lets you search for IDoc Types that contain user-specified
substrings in their name or description.
3. Select a type, and click OK to set as the IDoc type for the stage.


The segment hierarchy (that is, the selected IDoc type and its component segment
types) is now visible on the IDoc Type tab of the Stage page:

About the Stage IDoc Type Page


The IDoc Components area on the IDoc Type tab of the Stage page shows the
segments defined for the IDoc type with their descriptions. This area contains the
following information:
• Name. Shows the hierarchical relationship among the segments using a
tree structure, with their descriptions.
(A segment type can appear only once within an IDoc type. The names for
the segments are the segment definition names, not the segment type
names. You can infer the segment type name from the segment definition
name.)
• Assigned Input Link. After particular segments in the IDoc Type are
assigned to the input links of the stage, the IDoc Components control
shows the names of the links in the Assigned Input Link column of the
control. This gives you an overview of which segments are being loaded by
the stage.
Click Select to open the Select Message Type for IDoc dialog to choose a message
type from the available types. Message Type defaults to the first available message
type for the selected IDoc type.


Defining Character Set Mapping


You can define a character set map for a plug-in stage from the NLS tab on the
Stage page. The NLS tab appears only if you have installed NLS.
Specify information using the following components:
• Map name to use with stage. The default character set map is defined for
the project or the job. You can change the map by selecting a map name
from the list.
• Use Job Parameter… . Specifies parameter values for the job. Use the
format #Param#, where Param is the name of the job parameter. The string
#Param# is replaced by the job parameter when the job is run.
• Show all maps. Lists all the maps that are shipped with DataStage.
• Loaded maps only. Lists only the maps that are currently loaded.
For more information about NLS or job parameters, see DataStage
documentation.
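The #Param# convention can be sketched with a small substitution routine. This is an illustration only; the function and parameter names are not part of DataStage, which performs the substitution itself at run time:

```python
import re

def resolve_job_parameters(value, params):
    """Replace each #Param# token with the value of the named job
    parameter, following the #Param# format described above."""
    def substitute(match):
        name = match.group(1)
        if name not in params:
            raise KeyError("Undefined job parameter: " + name)
        return params[name]
    return re.sub(r"#(\w+)#", substitute, value)

# For example, a map name supplied through a job parameter:
map_name = resolve_job_parameters("#NLSMap#", {"NLSMap": "ISO8859-1"})
```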

Defining IDoc Load Input Links


After you select an IDoc type, you assign a segment type to each input link.


Begin the selection process from the General tab on the Input page:

The Input page has an Input name field, the General and Columns pages, and the
Columns… button. The Input link pages are almost identical to the Output link
pages of the IDoc Extract for SAP R/3 stage. They display the fields to load.
• Input name. The name of the input link. Choose the link you want to edit
from the Input name list box.
• Click the Columns… button to display a brief list of the columns designated
on the input link. As you enter detailed meta data in the Columns
page, you can leave this list displayed.
After you choose the input link from the Input name box, the following steps
summarize how input links function. The subsequent sections describe these steps
in detail.
1. Click Select… beside IDoc Component to Load on the General tab of the
Input page to open the Select IDoc Component to Load dialog (see
“Selecting the IDoc Component to Load” on page 4-15). The IDoc Component
to Load, Description, and Fields in Component information is
displayed after you select a segment for the link.
2. The selected segment type and its fields to load now appear on the
General tab of the Input page.


3. Columns are automatically generated for each link when the segment
type is selected (see “Modifying Columns” on page 4-16).
4. You can delete unnecessary columns.
5. Columns for missing fields can be added by clicking Add Columns on
the Columns tab of the Input page (see “Modifying Columns” on
page 4-16).
The following sections describe how input links function.

Selecting the IDoc Component to Load


When you click Select on the General tab of the Input page, the Select IDoc
Component to Load dialog appears. This dialog contains an IDoc Components
control identical to the one on the IDoc Type tab of the Stage page.
You select the segment definition for each link from the Select IDoc Component
to Load dialog.
The most recent released version of the IDoc and its segments are used.

1. Select a segment type and click OK.


2. The dialog closes, and you return to the General tab of the Input page
with the IDoc Component to Load, Description, and Fields in Component
information now shown, for example:

Modifying Columns
When you choose a segment from the Select IDoc Component to Load dialog, a
corresponding list of columns for the link including key columns is automatically
generated.
P-keys (primary keys) are named after the segment type. F-keys (foreign keys)
relate records on input links to their parent records. When key definitions
change, the meta data is synchronized throughout the links: F-keys and P-keys
are automatically updated, giving you flexibility in using data from different
columns in the source data.


You can view and modify these generated columns using the Columns tab on the
Input page as in the following example:

The Columns tab of the Input page has the following components:
• The Corresponding Load Field for Column "xx" is the single-row grid near
the bottom of the tab showing the load field that corresponds to the
selected column. (The value changes depending on the selected column.) It
includes Name, Description, Internal Type, and Length information. In this
screen, for example, no values are displayed since a key column was
selected with no corresponding segment field.
• Click Add Columns… to add columns for any fields that are not currently
represented in the columns list. The Add Columns… dialog appears where
you can select from the available load fields for columns.
You can also delete any unneeded columns.
• Click Validate Columns to check the column list for consistency with the
segment fields.
• Key Columns… . Click to open the Key Columns for Link dialog. This lets
you change the number of columns that represent the primary and foreign
key for the link (see “Synchronizing Columns” on page 4-19).


• Save… . Click to open the Save IDoc Segment Definition dialog to save
the meta data definitions when you finish defining the columns for the
link.

About Columns and Segments


The first column in the list on the Columns tab of the Input page
(IDOC_MATMAS03_P_KEY in the previous example) represents the primary key
for the IDocs.
Any generated value can be used as the value for the column, but for each value of
the key, there must be exactly one record that is received through the link.
Select a corresponding segment type for the rest of the input links the same way as
previously described. As each segment type is being selected using the Select IDoc
Component to Load dialog, segment types that have already been assigned a link
are indicated in the Assigned Input Link column. See the following example,
where the assigned link for E1PLOGI is DSLink9.

Parent and Child Segment Types


Generally, you should assign links to parent segment types before assigning links
to the child segment types.


Use Validate All on the General tab of the Stage page to detect when a link is
assigned to a segment but no links are assigned to its ancestor segments. If you try
to assign a segment type whose parent segment type has not been assigned to some
other link, you are warned, but you can continue.
Root Segments. When a segment type is assigned to a link, the automatically
generated column list includes primary and foreign key columns. When the
selected segment type is a root segment type (that is, one that has no parent
segment type in the IDoc), the foreign key column represents the primary key for
the IDoc as a whole.
Non-Root Segments. If a non-root segment type is selected, the resulting
generated foreign key column represents the value of the primary key of the parent
segment type.
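The key-generation rules above can be summarized in a short sketch. The naming scheme shown is illustrative only; the actual generated names (such as IDOC_MATMAS03_P_KEY) depend on the IDoc and segment types:

```python
def generate_key_columns(segment_type, parent_segment=None, idoc_key="IDOC_P_KEY"):
    """Sketch: every assigned segment gets a primary key column named
    after the segment, plus a foreign key column.  For a root segment
    the foreign key is the IDoc's own key; for a child segment it is
    the parent segment's primary key."""
    primary_key = segment_type + "_P_KEY"
    if parent_segment is None:
        foreign_key = idoc_key                    # root segment: FK = IDoc key
    else:
        foreign_key = parent_segment + "_P_KEY"   # child: FK = parent's PK
    return [primary_key, foreign_key]

root_cols = generate_key_columns("E1PLOGI")
child_cols = generate_key_columns("E1PITYP", parent_segment="E1PLOGI")
```

The child segment name E1PITYP is an assumed example; any segment assigned below E1PLOGI would follow the same pattern.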

Synchronizing Columns
It is sometimes more convenient to use more than one column for a particular key.
Use Key Columns… to open the Key Columns for Link dialog. This lets you
change the number of columns that are used to represent the primary and foreign
key for the link. If you change the primary key to two columns, for example,
and click OK, the column list is refreshed, generating as many columns for each
key as requested.

The IDoc Load Plug-In 4-19


PACK4SAPR3.book Page 20 Thursday, January 15, 2004 11:31 AM

You see a total of three key columns in the following example:

Changing the number of columns that are used for a primary key also causes the
column lists for links that include a corresponding foreign key to be updated.
Likewise, changing the number of columns used for a foreign key causes the
corresponding primary key columns to be updated in the link representing the
parent segment type.
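The refresh behavior described above can be sketched as follows. The column names and the function names are illustrative assumptions, not DataStage internals:

```python
def expand_key(base_name, count):
    """Widen a key to the requested number of numbered columns."""
    if count == 1:
        return [base_name]
    return ["%s_%d" % (base_name, i) for i in range(1, count + 1)]

def refresh_links(parent_pk_count, child_links):
    """Widening a parent's primary key also widens the matching
    foreign key on every child link, keeping the links in sync."""
    return {link: expand_key("F_KEY", parent_pk_count) for link in child_links}

pk_cols = expand_key("E1PLOGI_P_KEY", 2)   # primary key widened to two columns
fk_cols = refresh_links(2, ["DSLink9"])    # child foreign key widened to match
```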


5
The BAPI Plug-In

Introduction
This chapter describes the following for the BAPI plug-in, which is part of
version 5 of the Ascential PACK for SAP R/3, for DataStage 6.0 or later:
• Functionality
• Installation prerequisites
• Building a job
• Loading data into SAP R/3
• Extracting data from SAP R/3
DataStage provides enhanced SAP support with its Packaged Application
Connection Kit (PACK) for SAP R/3.
SAP created the Business Framework, which is an open, component-based
product architecture. It allows the technical integration and exchange of business
data among SAP R/3 components and between SAP and non-SAP components.
Business Objects and their BAPIs are important components of the Business
Framework, which are used by the plug-in to move data across SAP and non-SAP
components in a standard way.
A Business Object is the representation of a business entity, such as an employee or
a sales order, in the SAP R/3 system.
BAPIs are standard SAP interfaces that are used by the plug-in to integrate with
SAP R/3 systems. They are technically implemented using function modules that
are RFC-enabled (Remote Function Call) inside SAP systems.
The availability of these BAPIs as object-oriented interfaces lets other components
directly access the application layer of an SAP system without implementation
details. BAPIs allow integration with SAP at the business level.
You can use the BAPI plug-in to interface with the mySAP Business Suite, which
is a family of solutions and integrated application platforms, such as CRM, SCM,

and so forth that are supplied by SAP. Use the BAPI plug-in within DataStage as a
passive stage, for example, to load data into and extract data from SAP R/3 Enter-
prise as the target system. Do this by using the library of Business Application
Programming Interfaces (BAPIs), which is provided and maintained by SAP. The
databases always remain in a consistent state after executing the BAPIs
(methods).
The BAPI plug-in lets you use these BAPIs to design load or extract jobs. You can
do this because the plug-in captures the meta data for each BAPI and dynamically
builds the complex RFM call to execute these BAPIs.
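The idea of capturing meta data and building the call dynamically can be sketched as follows. The metadata layout and the function module name are illustrative assumptions, not the plug-in's internal format:

```python
def build_bapi_call(metadata):
    """Assemble a call description from captured BAPI meta data.
    Only parameters flagged as active contribute to the call that
    would be issued at run time; inactive ones are skipped."""
    call = {"function": metadata["rfc_name"], "params": {}}
    for name, info in metadata["parameters"].items():
        if info.get("active"):
            call["params"][name] = info.get("value")
    return call

meta = {
    "rfc_name": "BAPI_BANK_CREATE",
    "parameters": {
        "BANK_CTRY": {"active": True, "value": "US"},
        "BANK_ADDRESS": {"active": True, "value": {"BANK_NAME": "Example Bank"}},
        "TESTRUN": {"active": False, "value": "X"},  # inactive: omitted
    },
}
call = build_bapi_call(meta)
```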
You can use the BAPI plug-in GUI to select a Business Object and its BAPI from
the SAP R/3 system and display the corresponding interface which you can use
to load or extract data. The run-time component for the plug-in loads or extracts
data into or from the SAP application server.
See “Terminology” on page 1-7 for a list of the terms used in this chapter.

Functionality
The BAPI plug-in has the following functionality:
• Lets you choose and define SAP connections using the GUI.
• Lets you explore the SAP BOR, dynamically choose any BAPI, and store its
meta data in the job.
• Lets you view the BAPI interface using a function module (RFC) with its
import, export, and table parameters, and optionally decide which parameters
to use when executing the BAPI.
• Works with the mySAP Business Suite products.
• Loads and extracts data into and from SAP R/3 using BAPIs with appro-
priate log information.
• Supports NLS (National Language Support). For information, see
DataStage Server Job Developer’s Guide.
The following functionality is not supported:
• The creation of custom BAPIs using the plug-in.
• Data transformations or mappings. Use the Transformer stage to do this.
• Testing of the BAPI during design time.


The functionality of the BAPI plug-in is represented in the following diagram:

The links do the following:


1. Link 1. The GUI client logs on to the SAP R/3 application server and retrieves
meta data information from BOR.
2. Link 2. The GUI client stores BAPI meta data in the DataStage Repository.
3. Link 3. The run-time server reads the meta data from the DataStage Reposi-
tory and dynamically builds a BAPI call.
4. Link 4. The run-time server makes a BAPI call and processes the generated
dataset.
5. Link 5. The returned values are written to the appropriate logs.
In summary, the plug-in can dynamically call any BAPI and process the returned
data appropriately, based on whether it is a load or an extract BAPI.

Building a Job
The BAPI plug-in stage can be a data source and a data destination. You can add
links into the stage and out from the stage. Multiple links are allowed in both
directions, letting this stage call multiple BAPIs in the same job to load and extract
data at the same time.


If the job uses multiple input or output links, each link must define a unique
BAPI, that is, a link cannot contain a BAPI that is already used in another link in
the same job.
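This uniqueness rule amounts to a duplicate check across the job's links, sketched here with made-up link and BAPI names:

```python
def find_duplicate_bapis(link_bapis):
    """Return an error message for every BAPI that is assigned to
    more than one link in the same job."""
    seen = {}
    errors = []
    for link, bapi in link_bapis.items():
        if bapi in seen:
            errors.append("%s and %s both use %s" % (seen[bapi], link, bapi))
        else:
            seen[bapi] = link
    return errors

ok = find_duplicate_bapis({"Input1": "Bank.Create", "Output1": "Bank.GetDetail"})
bad = find_duplicate_bapis({"Input1": "Bank.Create", "Input2": "Bank.Create"})
```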
To build a job:
1. Create a DataStage job using the DataStage Designer.
2. Create the BAPI plug-in stage, adding the stages and links you need to load
and extract data. Double-click the BAPI plug-in stage icon to open the stage
editor dialog (GUI).
3. Define the stage properties.
4. Define the SAP R/3 connection details and the SAP logon information.
5. Specify the properties for input links for loading data to SAP R/3.
6. Specify the properties for output links for extracting data from SAP R/3.
7. Compile the job.
Each task is described in more detail in the following sections.

Defining Stage Properties


When you open the BAPI plug-in stage, the BAPI_PACK_for_R3 Stage dialog
opens. The BAPI plug-in for release 5 of the Ascential PACK for SAP R/3 uses this
icon:


The Stage page for the BAPI plug-in appears by default:

This dialog has the following pages, depending on your links:


• Stage. Displays the name of the stage you are editing. You can define
DataStage connection and logon details to a target or source SAP system.
You can also describe the purpose of the stage in the Description field of
the General tab. This page is similar to that in other R/3 and BW plug-ins
except the connection information is stored at the stage level, not at the link
level. This enables bi-directional links.
The NLS tab defines a character set map to use with the stage, if NLS is
enabled. For details, see “Defining Character Set Maps” on page 5-5.
• Input. Has an Input name field, the General, BAPI, Logs, and Columns
pages, and the Columns… button.
• Output. Has an Output name field, the General, BAPI, Read Logs, and
Columns pages, and the Columns… button.

Defining Character Set Maps


You can optionally define a character set map for a BAPI stage using the NLS tab.
You can change the default character set map defined for the project or the job by
selecting a map name from the list.


This tab also has the following components:


• Show all maps. Lists all the maps supplied with DataStage. Maps cannot
be used unless they are loaded using the DataStage Administrator.
• Loaded maps only. Displays the maps that are loaded and ready to use.
• Use Job Parameter… . Lets you specify a character set map as a parameter
to the job containing the stage.
• Allow per-column mapping. Lets you specify character set maps for individual
columns within the table definition. If per-column mapping is
selected, an extra property, NLS Map, appears in the grid in the Columns
tab.
For more information about NLS or job parameters, see DataStage
documentation.

Defining SAP Connection and Logon Details


The BAPI plug-in can use and share SAP connections created by other R/3 plug-ins,
such as IDoc Load, IDoc Extract, and ABAP. For security reasons,
the logon details for the BAPI plug-in are not stored at the server level.
Enter the following information on the General tab:
• DataStage Connection to SAP. The DataStage connection to the SAP R/3
system that is defined on the DataStage server machine and shared by all
DataStage users connected to that machine. The fields in this area are
read-only and get default values from the last connection that you used.
– Name. The name of the selected connection to the SAP R/3 system that
generates the data to be extracted.
– Select… . Click to choose a DataStage connection or define a new
connection to the SAP R/3 system. The selected connection provides all
needed connection details to communicate with the corresponding SAP R/3
system. This opens the Select DataStage Connection to SAP dialog (see the
following section on page 5-7).
– Description. Additional information about the selected connection.
– Application Server. The name of the host system running R/3.
– System Number. The number assigned to the SAP R/3 system used to
connect to R/3.


• SAP Logon Details. Provides all the necessary information to log on to SAP,
which is stored in the job.
– User Name. The user name that is used to log on to SAP.
– Password. A password for the specified user name.
– Client Number. The number of the client system used to log on to SAP.
– Language. The language used to log on to SAP.
• Description. Enter text to describe the purpose of the stage.

Selecting the DataStage Connection to SAP


Click Select… on the General tab of the Stage page to open the Select DataStage
Connection to SAP dialog. It lets you view previously-defined connections to
SAP systems and choose a DataStage connection to the SAP R/3 system. The
selected connection provides all required connection details to communicate with
the corresponding SAP R/3 system. You can create and define new connections
and delete existing connections.
Because the information is stored on the DataStage server, connections that you
create can be shared across several DataStage clients, for example:

New…, Properties…, and Remove are administrative connection operations,
similar to those for other R/3 plug-ins. They let you manage the
list of connections that is maintained on the DataStage server machine from the
Select DataStage Connection to SAP dialog. They function in the same way as
the corresponding buttons on the DataStage Connections to SAP page of the
DataStage Administrator for SAP utility.


For more information about creating or defining new connections and deleting
existing connections, see the following section and “The Administrator for SAP
Utility” on page 6-1.

Defining Connection Properties


Click Properties… on the Select DataStage Connection to SAP dialog to open the
Connection Properties dialog. Use this page to specify a name, a description, and
SAP connection details for a connection; when the dialog is opened from
Properties…, the name is read-only. You can also modify an existing connection here.

The DataStage Connections to SAP page of the DataStage Administrator for
SAP utility lets you specify a non-load balancing connection even if load
balancing is specified in the connection configuration.
This page has the following components:
• Application Server. The name of the R/3 server.
• System Number. The system number of the R/3 instance.
• Router String. Optional. The string used to connect to the remote SAP
server. If you use a SAP router to access your system, you must include a
fully-qualified node name for your RFC server machine in the SAP router
table.


• Use load balancing. Select to use load balancing when connecting to R/3.
Client connections are made through a message server rather than an
application server.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection details
specific to load balancing can be entered.
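The two connection shapes can be sketched like this. The dictionary key names follow common RFC conventions but are assumptions here, not the stage's actual configuration format:

```python
def connection_parameters(conn):
    """Pick the connection details the stage needs: a direct connection
    addresses an application server and system number, while a
    load-balanced connection addresses a message server, system ID,
    and logon group instead."""
    if conn.get("use_load_balancing"):
        return {"mshost": conn["message_server"],
                "sysid": conn["system_id"],
                "group": conn["group"]}
    return {"ashost": conn["application_server"],
            "sysnr": conn["system_number"]}

direct = connection_parameters(
    {"use_load_balancing": False,
     "application_server": "sapr3host", "system_number": "00"})
balanced = connection_parameters(
    {"use_load_balancing": True,
     "message_server": "sapmsgsrv", "system_id": "PRD", "group": "PUBLIC"})
```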

Defining Input Properties


Begin the selection process from the General tab on the Input page. When you
click the Input tab, the General tab appears by default, displaying the fields used
for loading data:

The Input page has an Input name field, the General, BAPI, Logs, and Columns
pages, and the Columns… button.
• Input name. The name of the input link. Choose the link you want to edit
from the Input name list.
• Click the Columns… button to briefly list the columns designated on the
input link. As you enter detailed meta data in the Columns page, you can
leave this list displayed.


Input General Page


This resizable tab has the following read-only components (except Description):
• Business Object. The name of the Business Object in the SAP R/3 system
used to load data.
• Method. The name of the BAPI in the SAP R/3 system you are using.
• Short Text. The text that describes the selected BAPI.
• BAPI Explorer. Click to open the Specify Filter Criteria for BAPIs dialog
(see the following section “Filtering Business Objects”). It lets you explore
the SAP BOR.
• Description. Enter text to describe the purpose of the link.
After you select a BAPI on the BAPI Explorer dialog (see “BAPI Explorer Dialog”
on page 5-11), the General tab automatically displays the selected Business
Object, BAPI, and short description of the selected BAPI.

Filtering Business Objects


When you click BAPI Explorer on the General tab of the Input or Output page,
the Specify Filter Criteria for BAPIs dialog appears first. The dialog lets you
select an ID to filter the business objects that are based on SAP components.

Note: This interim dialog appears only for release 4.6 and later of SAP. Otherwise,
the BAPI Explorer screen opens.

This dialog has the following components:

• Application Component ID. You can select a specific component from the
list or use <All Components> to include all components.


After you select a component ID, this dialog contains only those Business
objects that belong to the selected application component. The list of
available application components to choose from is dynamically built,
depending on the accessed system.
• BAPIs to Display. You can show only released BAPIs (the default) or all
BAPIs in the BAPI Explorer screen. You must ensure unreleased BAPIs are
complete and ready to use, or a warning appears. (Click Show All BAPIs
to see those BAPIs that are unreleased.)
• OK. Click to open the BAPI Explorer dialog.
• Cancel. Click to return to the Input or Output General page.

BAPI Explorer Dialog


Click OK on the Specify Filter Criteria for BAPIs dialog to open the BAPI
Explorer dialog. This lets you browse the SAP BOR to select a BAPI.
Example for input link. The following example lists the available Business
Objects in a tree structure under the root.
You can expand each Business Object node to list the BAPIs for that object.


In this example, the standard Bank Business Object is selected, and a list of BAPIs
is displayed for the expanded node.

Similarly, you can further expand nodes that show BAPIs to list parameters that
are defined for that BAPI.
Click Cancel to return to the calling dialog without selecting a BAPI.
Click OK to select a BAPI. OK is enabled only when the selected node in the tree
is the BAPI type. This indicates that you are selecting a BAPI, not another object in
the tree.
When you select a BAPI, the relevant fields on the Input page are populated with
the meta data.


Example for output links. The following example shows an equivalent screen
for an output link with Bank expanded:

Defining BAPI Interfaces


The BAPI tab of the Input page displays the interface for the selected BAPI. Use
this tab to design the run-time behavior of the selected BAPI.
You can view the parameters or attributes for the selected BAPI on the Import
(the default), Tables, and Export tabs. Fields that are required for import and table
parameters appear on the Columns page.
For information about parameters, see “Run-Time Component” on page 5-26.


Input BAPI Import Page

When a BAPI is first selected, only required parameters are active on the BAPI
Import tab of the Input page. For input links, fields that are required for import
parameters appear on the Columns page.
You can tell whether a parameter is active by the following indications:
• Green icons beside the parameter names indicate that the parameters are
active, that is, used to dynamically build BAPI calls at run time. (Red icons
indicate that parameters are inactive, unused when calling BAPIs.)
• Green icons display I or E to indicate whether table parameters are activated
for Import or Export.
To activate or deactivate parameters, do one of the following:
• Double-click a parameter name
• Right-click for a shortcut menu
When you move the cursor over a parameter, the cursor shape changes, indicating
that you can activate or deactivate the parameter.


Input BAPI Tables Page


Click the Tables tab on the Input BAPI page to view the table parameters of a
BAPI.

Fields that are required for table parameters appear on the Columns page.

Input BAPI Export Page


Click the Export tab on the Input BAPI page to view the export parameters of a
BAPI. By default, export parameters are optional and initially inactive.


Fields that are required for export parameters appear in a grid on the Input Logs
page, not on the Input Columns page, because they contain return values from
the BAPI call.

You can see the reference structure being used for each parameter with a short
text description of the parameter. This helps you decide whether to activate a
particular parameter.

Defining Log Files


Click the Logs tab of the Input BAPI page to view information about the return
values for exported fields from the BAPI call and the directory where these values
get stored.


The Logs tab appears:

This tab has the following components:

• Location for Return Parameters and other Log Files. By default, you see a
pathname whose directory corresponds to the selected BAPI. This directory
for temporary log files is created under the DataStage\DSSAPConnections
directory.
In this example, StandardMaterial.SaveData is the selected BAPI for the
stage. Assuming the connection name is C46, the default directory for return
values is:
<DSHOME>\DSSAPConnections\C46\BAPIs\Loads\StandardMaterial.SaveData

where <DSHOME> is the DataStage home directory.


• Browse… . Click to open the Browse directories dialog to find an existing
directory in which to store the values. (First clear Use default directory.)
• Use default directory. Clear this check box to use Browse… to search for
another directory in which to store the values, or enter the pathname in
Location for Return Parameters and other Log Files.
• Grid. Displays the fields that correspond to the parameters selected in the
Export tab on the Input BAPI page.
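The default pathname described above can be sketched as a simple join of the fixed directory components with the connection name and BAPI name (the function name here is illustrative only, not part of the product):

```python
import os

def default_return_dir(dshome, connection, bapi):
    """Build the default directory for return values on an input link,
    mirroring the layout shown above:
    <DSHOME>/DSSAPConnections/<connection>/BAPIs/Loads/<BusinessObject.Method>
    """
    return os.path.join(dshome, "DSSAPConnections", connection,
                        "BAPIs", "Loads", bapi)

# The example from the text: connection C46, BAPI StandardMaterial.SaveData
path = default_return_dir("/opt/datastage", "C46", "StandardMaterial.SaveData")
```

Changing the connection name or selecting a different BAPI changes only the last two components of the path.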

Input Columns Page


This page stores meta data for all the input fields to this link.

All fields in the grid except the first (BAPISeqNo) are derived from reference structures
for the import and tables parameters that get activated on the Input BAPI
page.
BAPISeqNo is automatically generated by the GUI to let the run-time component
distinguish between parent and child rows in an input dataset that represents a
header-detail relationship. For example, if the dataset contains a header record
with multiple child records, this field contains the same value for each of those
rows.
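For illustration, the shared-sequence-number behavior can be sketched as follows. The row contents here are hypothetical; only the idea that a header row and its child rows carry the same BAPISeqNo value comes from the description above:

```python
from itertools import groupby

# Hypothetical input rows: a header row and its child (detail) rows
# share one BAPISeqNo value, so each logical call can be reassembled.
rows = [
    {"BAPISeqNo": 1, "segment": "HEADER", "value": "PO-1001"},
    {"BAPISeqNo": 1, "segment": "ITEM",   "value": "item A"},
    {"BAPISeqNo": 1, "segment": "ITEM",   "value": "item B"},
    {"BAPISeqNo": 2, "segment": "HEADER", "value": "PO-1002"},
    {"BAPISeqNo": 2, "segment": "ITEM",   "value": "item C"},
]

def group_by_sequence(rows):
    """Collect consecutive rows that belong to one logical BAPI call,
    keyed by their shared BAPISeqNo value."""
    return {seq: list(grp)
            for seq, grp in groupby(rows, key=lambda r: r["BAPISeqNo"])}

calls = group_by_sequence(rows)  # two logical calls: sequence 1 and 2
```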
As for other plug-ins, you can use the following components for the column
definitions:
• Save… . Click to open the Save Table Definition dialog to save column
definitions to the DataStage Repository.
• Load… . Click to open the Table definitions dialog to load the column
definitions from the DataStage Repository into other stages in your job
design.

Defining Output Properties


When you click the Output tab, the General tab appears by default:

This resizable screen looks exactly like the Input General page, except that the
selected BAPI extracts data from SAP.
The Output page has an Output name field, the General (the default), BAPI,
Read Logs, and Columns pages, and the Columns… button.
• Output name. The name of the output link. Choose the link you want to
edit from the Output name list.
• Columns… . Click to display a brief list of the columns designated on the
output link. As you enter detailed meta data in the Columns page, you can
leave this list displayed.

Output General Page


This screen has the following read-only components (except Description):
• Business Object. The name of the selected Business Object.
• Method. The name of the BAPI you use for extracting data. After you select
a BAPI, the page automatically displays the related information for the
other fields.

• Short Text. The text that describes the selected BAPI.


• BAPI Explorer. Click to open the Specify Filter Criteria for BAPIs dialog
to explore the SAP BOR (see “Filtering Business Objects” on page 5-10).
• Description. Enter text to describe the purpose of the link.
After you select a BAPI on the BAPI Explorer dialog (see “BAPI Explorer Dialog”
on page 5-11), the General tab of the Output page automatically displays the
selected Business Object, BAPI, and short description of the selected BAPI.

Output BAPI Page


Click the BAPI tab of the Output page to open the Import tab by default. You can
view parameters or attributes of the BAPI on the Import, Tables, or Export page,
depending on their type.
This page displays the attributes for the selected BAPI. As for the Input BAPI
page, you can design and determine the run-time behavior of the selected BAPI.

Output BAPI Import Page

When you first select a BAPI, only required parameters are active.

The color of the icon beside each parameter name indicates whether the parameter is active:
• Green. Indicates an active parameter that is used when dynamically
building the BAPI call at run time.
• Red. Indicates an inactive parameter that is unused when calling the BAPI.
Double-click a parameter name, or right-click to open a shortcut menu, to activate
or deactivate parameters. When you move the cursor over a parameter, the cursor
shape changes, indicating that you can activate or deactivate that parameter.

Output BAPI Tables Page


The following example shows the Output BAPI Tables page with the shortcut
menu.
You can activate or de-activate table parameters for Import or Export. When
values need to be passed to SAP using table parameters, you can activate them for
Import (or Input). When values need to be extracted from SAP, you can activate
table parameters for Export.

Use the shortcut menu to activate table parameters:


• Activate Selected Parameter. Activates the table parameter for Export.
(Double-clicking a table parameter also activates it for Export by default.)
• Activate Selected Parameter for Input. Activates the table parameter for
Input. You can activate a table parameter for Input only by using the
shortcut menu.

A green icon displays I or E to indicate whether the table parameter is activated
for Import or Export.

Output BAPI Export Page


Export parameters, by default, are optional (initially inactive). The following
screen displays an example of the Output BAPI Export page with active
parameters.

You can see the reference structure being used for each parameter with a short
text description of the parameter to help you decide whether to activate a
particular parameter.

For output links, the following parameters appear on the Columns page:
• Parameters activated on the Export tab
• Tables parameters that are activated for extracting values
Fields for import parameters and for table parameters activated for Input appear
on the Output Read Logs page, because these fields are not extracted when the
job runs but must be read (loaded) to call the BAPI.

Output Read Logs Page


This page contains information about the imported fields for the BAPI call and the
directory from which these values get loaded.

The grid displays those fields that correspond to the parameters selected on the
Import tab on the Output BAPI page and the fields that correspond to parameters
activated on the Tables tab for Input on the Output BAPI page.

The locations for import parameters and other log files are specified on this
screen. By default, a directory whose name corresponds to the selected BAPI is
created under the DataStage\DSSAPConnections directory.
For example, if the selected BAPI for the stage is PurchaseOrder.GetDetails and
the connection name is C46, the default directory for import values is:
<DSHOME>\DSSAPConnections\C46\BAPIs\Extracts\PurchaseOrder.GetDetails

where <DSHOME> is the DataStage home directory.


You can specify another directory to store these import values by clearing Use
defaults and browsing for existing directories or by entering a value in the edit
control.

Click Browse beside Location for Log Files to open the Browse directories dialog
to select a directory.

You can also click Browse beside Input File for Import Parameters to open a
dialog to select a file.

Output Columns Page


This page stores meta data for all output fields for this link. All fields in the grid
except the first (BAPISeqNo) are derived from reference structures for the export
and tables parameters that get activated on the Output BAPI page.

An example of the Output Columns page follows:

The BAPISeqNo field is generated automatically so that the run-time component
can distinguish between parent and child rows in an output dataset that
represents a header-detail relationship. For example, if the dataset contains a
header record with multiple child records, the field contains the same value for
each of those rows.

Completing the Job


Complete the definition of the other stages in your job according to normal
DataStage procedures. Compile and run the job.

Run-Time Component
The BAPI plug-in run-time component is a modified sequential file stage that
supports multiple links in both directions. It dynamically builds a call to the
selected BAPI and executes that call.
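Conceptually, building the call from the activated parameters resembles the following sketch. The helper and its arguments are hypothetical; the actual construction is internal to the plug-in's run-time component:

```python
def build_bapi_call(active_imports, active_tables, row):
    """Assemble keyword arguments for a BAPI call from the parameters
    activated on the stage's BAPI page, taking values from an input row."""
    kwargs = {}
    for name in active_imports:   # scalar import parameters
        if name in row:
            kwargs[name] = row[name]
    for name in active_tables:    # table parameters carry lists of rows
        kwargs[name] = row.get(name, [])
    return kwargs

# Hypothetical usage: only the activated parameters reach the call.
call = build_bapi_call(["MATERIAL"], ["DESCRIPTION"],
                       {"MATERIAL": "M-01",
                        "DESCRIPTION": [{"TEXT": "hex bolt"}]})
```

Inactive parameters never appear in the assembled call, which is why activation on the BAPI page determines run-time behavior.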
Input links. The return values from the BAPI call are stored in the appropriate
directory that you specify on the Input Logs page.

The name of the log files corresponds to the parameter names that are activated
for the BAPI, for example:
• Export parameters. If an export parameter named PURCHASEORDER is
activated for the BAPI, a log file named PurchaseOrder.txt is created in the
Export directory under the directory specified on the Input Logs page.
• Tables parameters. You can use table parameters to import or export
values. Two log files are created for each parameter, which contain the
following:
– The parameter name with the .TXT suffix
– The parameter name with the _RETURN.TXT suffix
For example, for the table parameter named DESCRIPTION, log files named
DESCRIPTION.TXT and DESCRIPTION_RETURN.TXT are created.
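The naming rules above can be sketched as a small helper (illustrative only; the run-time component writes the files itself, and the casing of the manual's examples varies, so uppercase here follows the DESCRIPTION example):

```python
def log_file_names(param, is_table):
    """Return the log file name(s) created for an activated parameter:
    <PARAM>.TXT for the parameter itself, plus <PARAM>_RETURN.TXT
    when the parameter is a table parameter."""
    names = [param + ".TXT"]
    if is_table:
        names.append(param + "_RETURN.TXT")
    return names

log_file_names("DESCRIPTION", is_table=True)
# ["DESCRIPTION.TXT", "DESCRIPTION_RETURN.TXT"]
```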
Output links. The job expects import values to be available in the Import direc-
tory that gets created under the pathname specified on the Output Read Logs
page. The name of the log files corresponds to the parameter names that are acti-
vated for the BAPI, for example:
• Import parameter. If an import parameter named PURCHASEORDER is
activated for the BAPI, a log file named PurchaseOrder.txt is expected to be
in the Import directory under the pathname specified on the Output Read
Logs page.
• Tables parameters. You can use table parameters to import or export
values. Two log files get created for each active parameter, which contain
the following:
– The parameter name with the .TXT suffix
– The parameter name with the _RETURN.TXT suffix
For example, if the table parameter named DESCRIPTION is active, DESCRIP-
TION.TXT and DESCRIPTION_RETURN.TXT log files are created.

6
The Administrator for SAP Utility

DataStage Administrator for SAP Utility


Many of the features provided by the stage editor for managing and configuring
both R/3 connection properties and IDoc type properties are not associated with a
specific DataStage job but are more global in nature. It is convenient to have this
functionality available during job design in the stage editor, because a new job
often necessitates the configuration of a new connection and a new IDoc to be
received by that connection.
However, subsequent changes to these configurations have the potential to impact
many jobs. For example, the DataStage Administrator may determine that the
batch count for an IDoc is set too low. This can cause DataStage jobs that process
the IDoc to become backed up, since they cannot complete before the next batch of
IDocs arrives. The Administrator can increase the batch count for this IDoc type,
and all jobs that process the IDoc type are affected by the change.
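The batch-count behavior can be sketched as a simple counter. The class and the job callback are illustrative, not part of the product API:

```python
class IDocBatchTrigger:
    """Run a job callback once the configured number of IDocs of one
    type has arrived, mirroring the batch-count behavior described above."""

    def __init__(self, batch_count, run_job):
        self.batch_count = batch_count
        self.run_job = run_job
        self.pending = 0

    def on_idoc_received(self):
        self.pending += 1
        if self.pending >= self.batch_count:
            self.run_job()   # process the accumulated batch
            self.pending = 0

# With a larger batch count, jobs run less often on bigger batches,
# reducing the chance a run is still active when the next batch lands.
runs = []
trigger = IDocBatchTrigger(batch_count=3, run_job=lambda: runs.append("run"))
for _ in range(7):
    trigger.on_idoc_received()   # runs twice: after the 3rd and 6th IDoc
```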
The Ascential PACK for SAP R/3 includes a stand-alone utility called DataStage
Administrator for SAP that lets you manage the configurations of the various R/3
connection and IDoc type objects that are beyond the scope of a DataStage job.
See “Terminology” on page 1-5 for a list of the terms used in this chapter.
You can invoke this utility from the Ascential DataStage program group on a
DataStage client machine to do the following:
• Define the connection to the SAP database.
• Monitor the activity of the IDoc listener.

The Administrator for SAP Utility 6-1


• Migrate connection and IDoc type configuration settings from one


DataStage installation to another.
• Modify connection and IDoc type configuration settings.
• Schedule cleanup of temporary IDoc files.
DataStage Administrator for SAP is a utility that contains the DataStage
Connections to BW, DataStage Connections to R/3, and IDoc Cleanup and
Archiving pages, depending on which plug-ins you install. They are described in
the following sections (see Ascential PACK for SAP BW: Part No. 00D-025DS60 for
details about the BW PACK).

About the DataStage Connections to R/3 Page


This page displays all the connections to SAP that are currently defined on the
DataStage server machine. Add…, Properties…, Import into…, Export, and
Remove let you manage the list of connections that is maintained on the DataStage
server machine. They function the same as the corresponding buttons on the Select
DataStage Connection to R/3 dialog (see “Selecting the DataStage Connection to
SAP” on page 5-7).

The following steps let you define a DataStage connection to SAP. The subsequent
sections describe these steps in detail.
1. Run the DataStage Administrator for SAP on a DataStage client machine.
2. Click Add… ➤ New… to define the properties for the new connection.
3. Enter a connection name, a description, and SAP connection and logon
details.
4. Optionally, use load balancing.
5. Enter an IDoc listener program ID (the same ID must be defined for the
tRFC port in the SAP database).
6. Define the properties for running jobs automatically as IDocs arrive on
the DataStage Job Options for IDocs page.
7. Click Add to test the connection and logon properties and add the
connection. An IDoc listener server starts automatically after the new
connection is added.
8. Configure the SAP system to access DataStage by doing the following:
a. Log on to the R/3 system.
b. Create a tRFC port, and assign it the IDoc listener program ID that was set
for the connection.
c. Create a logical system to represent DataStage.
d. Attach the tRFC port to the logical system.
9. Click IDoc Types to see the IDoc types for the selected connection. It
opens the IDoc Types dialog where you can set the properties of IDoc
types.
The DataStage Connections to R/3 page has the following components:
• Add… . Click to open a popup menu with the New… and Import… items.
Do the following:
1. Click New… to open the Connection Properties dialog, which lets you
define the properties for the new connection. Here you can specify
Connection Name, Description, and SAP connection and default
logon details for the new connection. (See “Defining DataStage
Connections to R/3” on page 6-4.)

2. Click Import… to open a standard Open File dialog that lets you
select the export file to be imported. This lets you import a new
connection (see “Importing New Connections” on page 6-11).
• Properties… . Click to open the Connection Properties dialog, showing the
properties of the selected connection. This is the same dialog that is opened
when you click New on the Select DataStage Connection to SAP dialog,
but in this context the connection Name is read-only, and the Add button is
replaced by an OK button.
• Import into… . Click to import IDoc type configurations into the selected
connection that already exists. A standard Open File dialog appears so you
can select a file to be imported (see “Importing IDoc Type Configurations”
on page 6-11).
• Export… . Click to save the configuration information for the selected
connection and all its associated IDoc types into a file (see “Exporting
Connections” on page 6-13).
• Remove. Click to delete the selected connection after your confirmation.
• IDoc Types. Click to see the IDoc types for the selected connection. It opens
the IDoc Types dialog (see the section “About the IDoc Types Dialog” on
page 6-14). This button does not appear on the Select DataStage Connec-
tion to SAP dialog.
• IDoc Log. Click to open the IDoc Log dialog. This dialog displays log
messages reported by the IDoc listener for the connection (see the section
“About the IDoc Log Dialog” on page 6-15). This button does not appear
on the Select DataStage Connection to SAP dialog.
If Listen for IDocs received through this connection is cleared on the IDoc
Listener Settings page, the Import Into, IDoc Types, and IDoc Log buttons on
this page are disabled when you select that connection. Also, if this same option is
cleared, Import and Export operations apply only to the properties for the connec-
tion. They do not involve IDoc Type configurations. (This is useful for connections
that are only used for ABAP or BAPI stages, rather than IDoc stages.)

Defining DataStage Connections to R/3


Click Properties… on the DataStage Connections to R/3 page of the DataStage
Administrator for SAP utility to open the Connection Properties dialog. This
dialog contains the Connection and Logon Details, IDoc Listener Settings, and
the DataStage Job Options for IDocs pages.

The DataStage Connections to R/3 page lets the RFC Server make a non-load-
balancing connection even if load balancing is specified in the connection
configuration. It also provides an option that prevents the Listener from sending
status updates back to the R/3 system when IDocs are received.

Connection and Logon Details Page


Use this page to specify a name, description, and SAP connection and default SAP
logon details for the new connection.
A Connection and Logon Details page similar to the following screen appears;
it opens by default. (Note that Group appears on this screen when Use load
balancing is set.)

To define the DataStage connection to SAP:


1. Specify the SAP connection details:
• Application Server. The name of the R/3 server.
• System Number. The system number of the R/3 instance.
• Router String. Optional. The string used to connect to the remote SAP
server. If you use a SAP router to access your system, you must include a
fully-qualified node name for your RFC server machine in the SAP router
table.

2. Select Use load balancing to use load balancing when connecting to R/3.
The Application Server and the System Number controls are replaced by
Message Server, System ID, and Group controls so that connection
details specific to load balancing can be entered (see “Load Balancing” on
page 6-6).
3. Specify the default SAP logon details:
• User Name. The name of the user for connecting to SAP.
• Password. The password for User Name.
• Client Number. The SAP client number.
• Language. The language used for connecting to SAP.
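For reference, a router string is typically a chain of /H/ (host) entries with optional /S/ (service) entries. A small sketch of composing one follows; the host name is made up, and the exact route your installation needs comes from your SAP administrator:

```python
def router_string(hops):
    """Compose an SAP router string from (host, service) hops.
    Service may be None when the default SAProuter port is used."""
    parts = []
    for host, service in hops:
        parts.append("/H/" + host)
        if service:
            parts.append("/S/" + str(service))
    return "".join(parts)

# Hypothetical route through one SAProuter to the target system:
route = router_string([("saprouter.example.com", 3299)])
# "/H/saprouter.example.com/S/3299"
```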

Load Balancing
Select Use load balancing on the Connection and Logon Details page of the
Connection Properties dialog (see page 6-5) to balance loads when connecting to
the R/3 system. Load balancing works as follows:
1. R/3 lets you make logon connections through a message server. The message
server uses an algorithm that considers server workload and availability to
choose an appropriate application server to handle the logon.
When connections are configured, you can choose a load balancing connection
to a message server rather than a specific R/3 instance to retrieve and validate
IDoc types and meta data.
2. If load balancing is selected for this connection, the listener server uses
load balancing when returning status updates to R/3.
3. If load balancing is selected, the plug-in runtime component uses load
balancing when connecting to R/3 to validate IDoc type meta data at job
execution time.
The listener server does NOT use this connection to listen for IDocs arriving on an
RFC port since this is an RFC server connection. Load balancing is only a client
connection feature.
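The two connection styles can be contrasted in a sketch. The lowercase parameter names below follow common SAP RFC client conventions (ashost/sysnr for a direct connection, mshost/sysid/group for load balancing) and are assumptions here, not the PACK's internal names:

```python
def connection_params(cfg):
    """Return the RFC client parameters implied by a connection
    configuration: load balancing swaps the application-server fields
    for message-server fields, as described above."""
    params = {"client": cfg["client"], "user": cfg["user"],
              "passwd": cfg["password"], "lang": cfg["language"]}
    if cfg.get("use_load_balancing"):
        params.update({"mshost": cfg["message_server"],   # message server
                       "sysid": cfg["system_id"],
                       "group": cfg["group"]})
    else:
        params.update({"ashost": cfg["application_server"],  # direct
                       "sysnr": cfg["system_number"]})
    return params
```

The listener's own connection always uses the direct (ashost/sysnr) form, since load balancing is a client-connection feature only.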

IDoc Listener Settings Page


Click IDoc Listener Settings to open the page:

Use this page to configure the IDoc listener settings as follows:
1. Select Listen for IDocs received through this connection so that a listener
server runs continuously on the DataStage server machine. This check box is
selected by default. If selected, this option indicates that the connection is
enabled for use with IDoc Extract or IDoc Load.
If Listen for IDocs received through this connection is cleared, the other
controls on the IDoc Listener Settings page and DataStage Job Options for
IDocs page are disabled (including the labels for the controls). Also if cleared,
the Import Into, IDoc Types, and IDoc Log buttons on the DataStage Connec-
tions for R/3 page of the DataStage Administrator for SAP utility are disabled
when you select that connection.
2. Specify IDoc Listener Program ID. The listener registers this program ID
with the R/3 system. Use the same program ID for the tRFC port on the
SAP R/3 system to be invoked when the R/3 system sends an IDoc to
DataStage.
3. Clear Return status update upon successful receipt of IDoc to prevent
the listener from sending status messages to the R/3 system when IDocs
are received. (It is set by default.) Do this to reduce the load incurred by
the R/3 system when it sends IDocs to DataStage.
4. If you select Use load balancing on the Connection and Logon Details
page (see page 6-5), and the load balancing test connection succeeds, an
additional test connection is made using the IDoc Listener SAP Connec-
tion Details that you specify.
If this test connection fails, the following warning appears:
Currently unable to connect to SAP using the IDoc Listener
connection information specified for this connection. Do you
want to save your changes anyway?
Separate connection details must be provided for the listener, since the
Listener cannot make a load-balancing connection. If load balancing is indi-
cated on the Connection and Logon Details page, you can modify the
Application Server, System Number, and Router String controls shown on
the IDoc Listener Settings page.
The Application Server, System Number, and Router String information is
used by the listener when it connects to the R/3 system. If you clear Use load
balancing on the Connection and Logon Details page, the listener uses the
connection details specified on that page. In that case, the Application Server,
System Number, and Router String controls on the IDoc Listener Settings
page displays values copied from the Connection and Logon Details page,
but these values are read-only (disabled).

DataStage Job Options for IDocs Page


This page contains information used to run DataStage jobs automatically when a
specified number of IDocs of a given type are received from the R/3 system
specified for the connection. (This feature also depends on properties that are
defined elsewhere for each IDoc Type.) The properties default to the current
DataStage logon details for the user.

Click DataStage Job Options for IDocs to open the page:

To specify job options and add the new connection:


1. Select Run appropriate DataStage jobs automatically after receiving IDocs
from this SAP system to run jobs automatically after the specified number of
IDocs of a given type are received from the R/3 system specified for the
connection.
Default DataStage Logon Details for Running the Jobs displays the default
user name and password for the DataStage user.
2. Click Add to add the new connection. (If you open the Select DataStage
Connection to SAP dialog using Add, you see Add here. If you open the
Connection Properties dialog using Properties…, you see OK here.)
3. Specify Port Version, the version of the port used to send IDocs to
DataStage. This property lets the R/3 system appear to third-party
applications such as DataStage as if it has an earlier version than it
actually has, so those applications do not have to change when the R/3
system is upgraded (see the next section for details).

Reading IDoc Meta Data


The Port Version list on the DataStage Job Options for IDocs page of the
Connection Properties dialog (see page 2-12) specifies the version of the port that
sends IDocs to DataStage. This property lets the R/3 system appear to a third-party
application such as DataStage as if the R/3 system has an earlier version than it
actually has. It lets you tell DataStage to expect IDocs to have an earlier version of
control record and administrative field meta data. Thus, third-party applications
do not have to change when the R/3 system is upgraded. This property is related
to the R/3 Version list on the IDoc Type Properties page (see “Defining IDoc Type
Properties” on page 3-26).
Select one of the following:
• 4.x. More recent (4.0A or later) meta data is used for control records and
administrative fields. 4.x is the default version. If you select 4.x, but the
specified R/3 system is version 3.2 or earlier, an error occurs when you try
to save your changes to the connection. (No error occurs if the connection
is unsuccessful.)

• 3.0/3.1. Stages that read IDocs for this connection use meta data for IDoc
control records and segment administrative fields that corresponds to the
meta data used for R/3 Version 3.1 or earlier systems. This meta data is
used even if the R/3 system specified in the connection actually has a later
version.

Importing New Connections


To import new DataStage connections to SAP:
1. Click Add… ➤ Import… from the DataStage Connections to R/3 page to
import new connections. The list control shows the IDoc types whose configu-
rations are to be imported; it includes only those IDoc types that are not
already configured for the connection.
2. A standard Open File dialog opens that lets you select the export file to
be imported. Normally, the export file is created by exporting a connec-
tion from another DataStage installation (see “Exporting Connections” on
page 6-13.)

3. The Add Connection dialog opens, with property values that default to
those of the connection being imported. If the name of the connection is
the same as that of an existing connection, you are asked if you want to
overwrite the existing connection.
4. The Import IDoc Type Configurations dialog opens, which shows the
name and description of the newly imported connection, and lists the
IDoc types whose configurations are imported with the connection (see
the following section).

Importing IDoc Type Configurations


To import IDoc type configurations into existing connections:
1. Click Import into… from the DataStage Connections to R/3 page to open
the standard Open File dialog. This lets you select a file to import.
2. The Import IDoc Type Configurations dialog opens. This dialog displays
the set of IDoc types that have been configured for the connection whose
IDoc types you have chosen to import. These types are imported into an
existing connection whose name and description are also displayed in the
dialog.
This is the same dialog used when importing an entire connection using the
Import menu pick. (The list control for the Import menu pick showing the
IDoc types whose configurations are to be imported includes only those IDoc
types that are not already configured for the connection.)
The capability to export SAP connection definitions with their associated IDoc
types lets you easily migrate your configurations from one DataStage server
system to another.

The Import IDoc Type Configurations dialog:

3. The following information is displayed:


• Import into Connection. The existing connection receiving imported IDoc
types.
• Description. A description of the connection.
• Import configurations for these IDoc Types. A list of the IDoc types
whose configurations are imported.
4. Click Configuration… to view and modify the configuration for the
selected IDoc type. The list control has check boxes for each item, so you
can import configurations for a subset of the IDoc types shown in the list.
5. When you click OK, the configurations are imported. As each checked
IDoc type configuration is imported, the IDoc type disappears from the
list control.
6. If any errors occur while importing a configuration, the import process
stops, the IDoc type with the problem is highlighted, and the problem is
reported in a message box. You can resolve the problem by changing the
properties or skip the import for this particular IDoc type by clearing its
check mark and clicking OK again so any remaining configurations are
imported.

Exporting Connections
To export connections by saving configuration information into a file:
1. Click Export… from the DataStage Connections to R/3 page to open the
Export Connection dialog. This dialog lets you save the following configura-
tion information for the selected connection and all its associated IDoc types
into a file:

• Connection. The connection whose configuration you want to export.


• Description. A description of the connection.
2. The Export configurations for these IDoc Types control lists all IDoc
types that are configured for the connection. You can export only a subset
of these configurations by clearing the check boxes next to the IDoc types
whose configurations you do not want.
3. Click Exported Properties… to modify the connection properties to be
exported.
4. Click Exported Configuration… to modify the IDoc type configuration
details to be exported for a selected IDoc type.
5. After you click OK, a Save As dialog appears so you can specify the name
and location of the export file. The file is given a .cxp extension.

About the IDoc Types Dialog


To set the properties of the IDoc types for the selected connection:
1. Open the IDoc Types dialog by clicking IDoc Types on the DataStage
Connections to R/3 page.
This dialog is similar to the Select IDoc Type dialog (see “Selecting IDoc
Types” on page 3-23), but OK is replaced by Close, and there is no Cancel.

The Connection field specifies the connection whose IDoc types are displayed
in the list with descriptive text in the Description field.
2. Click Find… to open a Find IDoc Type dialog to search for IDoc types
that contain user-specified substrings in their name or description as in
the Select IDoc Type dialog (see “Selecting IDoc Types” on page 4-10).
3. Click Properties… to examine and change the DataStage configuration
for the selected IDoc type using the IDoc Type Properties dialog (see
“Defining IDoc Type Properties” on page 3-26). (You can also do this by
double-clicking an IDoc type in the list.)

About the IDoc Log Dialog


To view log messages about the IDoc listener associated with a connection:
1. Open the IDoc Log dialog by clicking IDoc Log on the DataStage Connec-
tions to SAP page.

The Connection field specifies the connection whose IDoc log messages are
displayed. Descriptive text is included in the Description field.
2. The IDoc Log Messages field lists log messages about the activities of the
IDoc listener that is associated with the connection. When the dialog first
opens, this list is automatically scrolled to the end so that the most recent
messages are visible.
3. Click Refresh to reload the log messages, including any that were gener-
ated since you first opened the dialog.
4. Click Clear Log to delete the messages currently in the log (after you
provide confirmation) and refresh the display.

About the IDoc Cleanup and Archiving Page


This page lets you indicate the frequency and timing of automatic cleanup (and
archiving, depending on the setting of a particular IDoc type configuration) of the
files that temporarily store IDocs awaiting processing by DataStage jobs.


Note: Use this administrative utility together with the Archive processed IDoc
files setting on the IDoc Type Properties dialog to archive IDoc files (see
“Defining IDoc Type Properties” on page 3-26).

Additionally, you can achieve finer control by running the cleanup executable
manually from the command line on the DataStage server machine. The dsidoccln
command is in the bin directory of the DataStage home directory (<dshome>).
Open the IDoc Cleanup and Archiving page from the DataStage Administrator
for SAP.

To set the frequency and timing of automatic cleanup:


1. Set the Frequency to indicate whether cleanup should be done daily, weekly
(on a particular day of the week), or not at all (that is, never). The default is
weekly every Saturday.
2. Specify the Time (of day) that cleanup should be performed. If the
Frequency control is set to Never, Time is disabled.
3. Click Advanced Settings to open the Advanced Settings for IDoc
Cleanup and Archiving dialog (see the following section).
4. Changes to the settings on this page are applied as soon as you close the
application or switch to the DataStage Connections to SAP page.
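As a hedged illustration of the weekly schedule semantics described above (a sketch only, not the utility's actual code; the function name is invented), the next cleanup time for a weekly Saturday setting could be computed like this:

```python
from datetime import datetime, timedelta

def next_weekly_run(now, weekday, hour):
    # Next occurrence of the given weekday (0=Monday .. 6=Sunday) at the
    # given hour. Hypothetical helper; the Administrator for SAP stores
    # its schedule internally.
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    days_ahead = (weekday - now.weekday()) % 7
    candidate += timedelta(days=days_ahead)
    if candidate <= now:
        candidate += timedelta(days=7)
    return candidate

# Example: weekly cleanup every Saturday (weekday 5) at 02:00
now = datetime(2004, 1, 15, 11, 31)   # a Thursday
print(next_weekly_run(now, 5, 2))     # 2004-01-17 02:00:00 (a Saturday)
```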


Advanced Settings for IDoc Cleanup and Archiving


Click Advanced Settings on the IDoc Cleanup and Archiving page of the
DataStage Administrator for SAP utility to open the Advanced Settings for IDoc
Cleanup and Archiving dialog. This dialog explains the purpose of the Job
Inactivity Timeout setting, and lets you modify it.

If you select No Timeout, jobs are always assumed to be active, regardless of how
much time has passed since the jobs last ran.
Selecting this check box disables Job Inactivity Timeout.

Note: Setting a timeout of 0 days is not equivalent to selecting No Timeout.
Setting the timeout to 0 days causes the cleanup process to treat jobs as
inactive no matter how recently they have run.
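The distinction between No Timeout and a 0-day timeout can be sketched as follows (a hypothetical helper that mirrors the documented semantics, not the actual cleanup code):

```python
from datetime import datetime, timedelta

def job_is_active(last_run, timeout_days, now):
    # None models the No Timeout check box: jobs are always assumed active.
    if timeout_days is None:
        return True
    # A 0-day timeout makes every job inactive, however recently it ran.
    return now - last_run < timedelta(days=timeout_days)

now = datetime(2004, 1, 15, 12, 0)
ran_an_hour_ago = now - timedelta(hours=1)
print(job_is_active(ran_an_hour_ago, None, now))  # True  (No Timeout)
print(job_is_active(ran_an_hour_ago, 0, now))     # False (0 days: always inactive)
print(job_is_active(ran_an_hour_ago, 7, now))     # True  (ran within the last 7 days)
```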



A
SAP Authorization
Requirements for ABAP

This appendix describes the SAP authorization requirements to run Version 3.0.1r1
of the ABAP Extract for SAP R/3 plug-in. It documents how to manually create a
Z_DS_PROFILE authorization profile and the Z_RFC_DS_SERVICE RFC
function module in SAP R/3.
You already have these two components installed on your R/3 system if you
imported transport requests as described in “Integrating DataStage with SAP R/3
Systems” on page 2-4. However, if this import is unsuccessful, you can manually
create these components in your R/3 systems (see “Creating an Authorization
Profile Manually” and “Installing the RFC Function Code Manually” on page A-2
and page A-5 respectively).
The appendix also describes the configuration of the SAP Dispatch/Gateway
service. Information about packaged extraction jobs that cover common
extraction scenarios, which you can customize for your environment, is
provided in Appendix B, “Packaged Extractions for ABAP.”

SAP Authorization Requirements


When third party products need to communicate with R/3 systems, system
administrators and project managers must enforce security for user access to the
business information.
Version 3.0.1r1 of the DataStage ABAP Extract for R/3 plug-in addresses this
concern and provides a comprehensive and flexible way to communicate with R/3
systems without compromising the security aspects that already exist.
Depending on the level of security desired and the type of system (DEV, QAS, or
PRD) from which to extract data, you can assign the standard SAP S_A.DEVELOP
profile or the Z_DS_PROFILE that is shipped as a transport request on the
DataStage PACK for SAP R/3 client CD to ABAP Extract plug-in users. The
Z_DS_PROFILE, which you can install by importing the transport request or
create manually, contains the minimum authorizations for designing and running
DataStage jobs.
These limited authorizations ensure that ABAP Extract plug-in users can execute
only those transactions to which they have permissions and are denied access to
other transactions. For more details about each authorization, see “Creating an
Authorization Profile Manually” on page A-2.

Creating an Authorization Profile Manually


The following sections describe how to create a Z_DS_PROFILE authorization
profile manually.
This profile contains these SAP authorization requirements:
• RFC authorization for DataStage
• Authorization for DataStage
• CPI-C authorization for DataStage
• Table display authorization for DataStage
The SAP Administrator needs to create a profile in the R/3 system. This profile
contains the authorization objects referenced in the following text with their
respective authorizations. The Administrator must then assign this profile to
DataStage Extract Pack users.
The SAP Administrator should do the following:
1. Create a Z_DS_PROFILE profile (SAP Transaction SU02) containing the
following authorization objects for RFC authorization for DataStage. Define
the authorizations, which are described in the following sections, before the
profile is created.
a. Authorization Objects. The standard SAP objects.
b. Authorizations. The authorizations that must be created with the values
defined in the following sections (SAP Transaction SU03). In some cases,
you can use a standard SAP authorization as suggested. You can use the
naming conventions for your organization instead of those used in these
sections, that is, Z:DS_XXX.

Authorization Object   Text                                       Authorization
S_RFC                  Authorization check for RFC access         Z:DS_RFC
S_DATASET              Authorization for file access              Z:DS_DATASET
S_CPIC                 CPIC calls from ABAP programs              Z:DS_CPIC
S_TABU_DIS             Table maintenance (using standard tools,   Z:DS_TABU
                       such as SM30)

2. Assign the profile to ABAP Extract Pack users.

RFC Authorization for DataStage


Define the following authorization:
• Authorization. Z:DS_RFC - no standard authorization available
• Text. RFC authorization for DataStage
• Class. AAAB - cross-application authorization objects
• Object. S_RFC - authorization check for RFC access

Field      Description                          Value
ACTVT      Activity                             16
RFC_NAME   Name of RFC to be protected          CADR, RFC1, SDTX, Y*, Z*
RFC_TYPE   Type of RFC object to be protected   FUGR

Authorization for DataStage


Define the following authorization:
• Authorization. Z:DS_DATASET (you can use the standard authorization
S_DATASET_AL)
• Text. File access authorization for DataStage
• Class. BC_A - Basis: Administration
• Object. S_DATASET - Authorization for file access

Field      Description         Value
ACTVT      Activity            34
PROGRAM    ABAP program name   Y*, Z*
FILENAME   Physical file name  *


CPI-C Authorization for DataStage


Define the following authorization:
• Authorization. Z:DS_CPIC (you can use the standard authorization
S_SAPCPIC)
• Text. CPI-C authorization for DataStage
• Class. BC_A - Basis: Administration
• Object. S_CPIC - CPIC call from ABAP programs

Field      Description                    Value
ACTVT      Activity                       37
PROGRAM    ABAP program name              Y*, Z*
CPICDEST   Symbolic destination           ''
ABAPFORM   Name of an ABAP FORM routine   CPIC

Table Display Authorization for DataStage


Define the following authorization:
• Authorization. Z:DS_TABU (you can use the standard authorization
S_TABU_SHOW)
• Text. Table display authorization for DataStage
• Class. BC_A - Basis: Administration
• Object. S_TABU_DIS - table maintenance (using standard tools, such as
SM30)

Field       Description           Value
ACTVT       Activity              03
DISBERCLS   Authorization group   *

Note: To upload or delete ABAP code, SAP users (CPIC and Dialog) must be
registered within SAP as developers. To do this, log on to the Online
Support System (OSS), and select the appropriate SAP installation number.
Go to Register ➤ Register Developer, and type the SAP user name. OSS
provides a 20-character access key. Record this access key, and create a
dummy ABAP program. When SAP prompts you for an access key, type
this OSS key, and the system will confirm it.


Installing the RFC Function Code Manually


To install an RFC manually for version 3.1 or later of SAP R/3, you need to
consider installation prerequisites before you begin the installation. The screens in
the following sections may differ depending on your SAP R/3 version.
Installation Prerequisites. To install the Z_RFC_DS_SERVICE RFC function code
manually, follow these guidelines:
• You must be the SAP Administrator or an SAP user with S_A.DEVELOP
authorization to create the RFC function.
• You can create a function group or find an existing function group. If you
use an existing function group and you are not the SAP Administrator, you
must be the owner of the function group.
Installation Steps. To install the RFC code:
1. Log on to SAP R/3.
2. Start Transaction SE37. The Function Builder: Initial Screen dialog box
appears. Do the following:
a. Type Z_RFC_DS_SERVICE in the Function module box.
b. Click Create. The Create Function Module dialog box appears.
3. To create a function module:
a. Type the name of the function group in which you want the RFC function
code to be created in the Function group field.
b. Optional. Type DataStage General Service Utility in the Short
text field.
c. Click Save to create the module.
4. The Attributes page of the Function Builder: Change
Z_RFC_DS_SERVICE dialog box appears.


Specify the following values on the Attributes page, and use the defaults for
the remaining entries:

• Application: Z
• Short text: DataStage General Service Utility
• Processing type: Remote-enabled module
• Update module: Start immed.
If you need help determining which function group to use, see step 2 in “Trou-
bleshooting for RFC Installation for SAP R/3” on page A-17.
5. Click the Import page to enter the import parameters. New service types
exist in Z_RFC_DS_SERVICE so that job parameters can be passed to
ABAP programs that run in the background. For details about running
these programs in the background, see “Running Programs in the Back-
ground” on page 2-15.


The Import page appears:

Type the following values shown on this sample Import page for the Parameter
name, Type, Reference type, Default value, Optional, and Pass value fields:

Parameter name   Type   Reference type   Default value   Optional   Pass value
I_SERVE_TYPE     LIKE   SY-TFILL                                    Yes
I_TABLE_NAME     LIKE   DD02T-TABNAME                    Yes        Yes
I_REPORT_NAME    LIKE   D010SINF-PROG                    Yes        Yes
I_DELIMITER      LIKE   SONV-FLAG        SPACE           Yes        Yes
I_NO_DATA        LIKE   SONV-FLAG        SPACE           Yes        Yes
I_ROWSKIPS       LIKE   SOID-ACCNT       0               Yes        Yes
I_ROWCOUNT       LIKE   SOID-ACCNT       0               Yes        Yes
I_CURR_VARIANT   LIKE   RSVAR-VARIANT                    Yes        Yes
I_VARI_DESC      LIKE   VARID                            Yes        Yes


6. Click the Export page to enter the export parameters.

Type the following values shown on this sample Export page for the Parameter
name, Type spec., Reference type, and Pass val. fields:

Parameter name    Type spec.   Reference type      Pass val.
O_TABLE_SIZE      LIKE         SY-TFILL            Yes
O_ERROR_INCLUDE   LIKE         SY-REPID            Yes
O_ERROR_LINE      LIKE         SY-INDEX            Yes
O_ERROR_MESSAGE   LIKE         RSLINLMSG-MESSAGE   Yes
O_ERROR_OFFSET    LIKE         SY-TABIX            Yes
O_ERROR_SUBRC     LIKE         SY-SUBRC            Yes
O_SAP_VER         LIKE         SVERS-VERSION       Yes
O_DSS_VER         LIKE         SVERS-VERSION       Yes
O_VARIANT         LIKE         RSVAR-VARIANT       Yes


7. Click the Tables page to enter the table parameters.

Type the values shown here on this sample Tables page for the Parameter
name, Type spec., Reference type, and Optional fields according to the
following table:

Parameter name   Type spec.   Reference type   Optional
T_SQLCON         LIKE         RFC_DB_OPT       Yes
T_PARAM          LIKE         RSPARAMS         Yes
T_CODE           LIKE         PROGTAB          Yes
T_FIELDS         LIKE         RFC_DB_FLD       Yes
T_DATA           LIKE         TAB512           Yes
T_VARI_TEXT      LIKE         VARIT            Yes

8. Click the Exceptions page to enter the exceptions parameters.


The Exceptions page appears:

Type the following values shown on this Exceptions page for the Exception
field:

Exception
TABLE_NOT_FOUND
WRONG_TYPE
TABLE_WITHOUT_DATA
OPTION_NOT_VALID
FIELD_NOT_VALID
DATA_BUFFER_EXCEEDED
REPORT_NOT_FOUND

9. Save the RFC interface definition.


10. Click Source code. The source code appears in the ABAP Editor window:

To replace this default code with the source code provided on the CD, that is,
the ZDSRFC.TXT file in the RFC directory:
a. Choose Utilities ➤ More utilities.
b. Choose Upload/Download ➤ Upload from the ABAP Editor menu.
c. Specify the pathname for the ZDSRFC.TXT file in the RFC directory on the
CD. For example, type E:\Datastage\RFC\ZDSRFC.TXT in the File
name field.
11. After the upload is complete, save the RFC function code, and activate it.


Configuring the SAP Dispatch/Gateway Service


In order to use the DataStage PACK for SAP R/3 correctly, you must add entries
for the SAP Dispatch/Gateway service to the services file for your DataStage client
and server systems. If SAP R/3 is already configured on the DataStage client and
server systems, these entries may already be added.
The location of the services file depends on your platform:
Windows NT: \winnt\system32\drivers\etc\services
UNIX: /etc/services
Add the following entries to the services file:
#
# SAP Port
#
sapdp00 3200/tcp
sapdp01 3201/tcp
sapdp02 3202/tcp
sapdp03 3203/tcp
sapdp04 3204/tcp
sapdp05 3205/tcp
sapdp06 3206/tcp
sapdp07 3207/tcp
sapdp08 3208/tcp
sapdp09 3209/tcp
sapdp10 3210/tcp
sapdp11 3211/tcp
sapdp12 3212/tcp
sapdp13 3213/tcp
sapdp14 3214/tcp
sapdp15 3215/tcp
sapdp16 3216/tcp
sapdp17 3217/tcp
sapdp18 3218/tcp
sapdp19 3219/tcp
sapdp20 3220/tcp
sapdp21 3221/tcp
sapdp22 3222/tcp
sapdp23 3223/tcp
sapdp24 3224/tcp


sapdp25 3225/tcp
sapdp26 3226/tcp
sapdp27 3227/tcp
sapdp28 3228/tcp
sapdp29 3229/tcp
sapdp30 3230/tcp
sapdp31 3231/tcp
sapdp32 3232/tcp
sapdp33 3233/tcp
sapdp34 3234/tcp
sapdp35 3235/tcp
sapdp36 3236/tcp
sapdp37 3237/tcp
sapdp38 3238/tcp
sapdp39 3239/tcp
sapdp40 3240/tcp
sapdp41 3241/tcp
sapdp42 3242/tcp
sapdp43 3243/tcp
sapdp44 3244/tcp
sapdp45 3245/tcp
sapdp46 3246/tcp
sapdp47 3247/tcp
sapdp48 3248/tcp
sapdp49 3249/tcp
sapdp50 3250/tcp
sapdp51 3251/tcp
sapdp52 3252/tcp
sapdp53 3253/tcp
sapdp54 3254/tcp
sapdp55 3255/tcp
sapdp56 3256/tcp
sapdp57 3257/tcp
sapdp58 3258/tcp
sapdp59 3259/tcp
sapdp60 3260/tcp
sapdp61 3261/tcp
sapdp62 3262/tcp
sapdp63 3263/tcp
sapdp64 3264/tcp
sapdp65 3265/tcp
sapdp66 3266/tcp


sapdp67 3267/tcp
sapdp68 3268/tcp
sapdp69 3269/tcp
sapdp70 3270/tcp
sapdp71 3271/tcp
sapdp72 3272/tcp
sapdp73 3273/tcp
sapdp74 3274/tcp
sapdp75 3275/tcp
sapdp76 3276/tcp
sapdp77 3277/tcp
sapdp78 3278/tcp
sapdp79 3279/tcp
sapdp80 3280/tcp
sapdp81 3281/tcp
sapdp82 3282/tcp
sapdp83 3283/tcp
sapdp84 3284/tcp
sapdp85 3285/tcp
sapdp86 3286/tcp
sapdp87 3287/tcp
sapdp88 3288/tcp
sapdp89 3289/tcp
sapdp90 3290/tcp
sapdp91 3291/tcp
sapdp92 3292/tcp
sapdp93 3293/tcp
sapdp94 3294/tcp
sapdp95 3295/tcp
sapdp96 3296/tcp
sapdp97 3297/tcp
sapdp98 3298/tcp
sapdp99 3299/tcp
sapgw00 3300/tcp
sapgw01 3301/tcp
sapgw02 3302/tcp
sapgw03 3303/tcp
sapgw04 3304/tcp
sapgw05 3305/tcp
sapgw06 3306/tcp
sapgw07 3307/tcp
sapgw08 3308/tcp


sapgw09 3309/tcp
sapgw10 3310/tcp
sapgw11 3311/tcp
sapgw12 3312/tcp
sapgw13 3313/tcp
sapgw14 3314/tcp
sapgw15 3315/tcp
sapgw16 3316/tcp
sapgw17 3317/tcp
sapgw18 3318/tcp
sapgw19 3319/tcp
sapgw20 3320/tcp
sapgw21 3321/tcp
sapgw22 3322/tcp
sapgw23 3323/tcp
sapgw24 3324/tcp
sapgw25 3325/tcp
sapgw26 3326/tcp
sapgw27 3327/tcp
sapgw28 3328/tcp
sapgw29 3329/tcp
sapgw30 3330/tcp
sapgw31 3331/tcp
sapgw32 3332/tcp
sapgw33 3333/tcp
sapgw34 3334/tcp
sapgw35 3335/tcp
sapgw36 3336/tcp
sapgw37 3337/tcp
sapgw38 3338/tcp
sapgw39 3339/tcp
sapgw40 3340/tcp
sapgw41 3341/tcp
sapgw42 3342/tcp
sapgw43 3343/tcp
sapgw44 3344/tcp
sapgw45 3345/tcp
sapgw46 3346/tcp
sapgw47 3347/tcp
sapgw48 3348/tcp
sapgw49 3349/tcp
sapgw50 3350/tcp


sapgw51 3351/tcp
sapgw52 3352/tcp
sapgw53 3353/tcp
sapgw54 3354/tcp
sapgw55 3355/tcp
sapgw56 3356/tcp
sapgw57 3357/tcp
sapgw58 3358/tcp
sapgw59 3359/tcp
sapgw60 3360/tcp
sapgw61 3361/tcp
sapgw62 3362/tcp
sapgw63 3363/tcp
sapgw64 3364/tcp
sapgw65 3365/tcp
sapgw66 3366/tcp
sapgw67 3367/tcp
sapgw68 3368/tcp
sapgw69 3369/tcp
sapgw70 3370/tcp
sapgw71 3371/tcp
sapgw72 3372/tcp
sapgw73 3373/tcp
sapgw74 3374/tcp
sapgw75 3375/tcp
sapgw76 3376/tcp
sapgw77 3377/tcp
sapgw78 3378/tcp
sapgw79 3379/tcp
sapgw80 3380/tcp
sapgw81 3381/tcp
sapgw82 3382/tcp
sapgw83 3383/tcp
sapgw84 3384/tcp
sapgw85 3385/tcp
sapgw86 3386/tcp
sapgw87 3387/tcp
sapgw88 3388/tcp
sapgw89 3389/tcp
sapgw90 3390/tcp
sapgw91 3391/tcp
sapgw92 3392/tcp


sapgw93 3393/tcp
sapgw94 3394/tcp
sapgw95 3395/tcp
sapgw96 3396/tcp
sapgw97 3397/tcp
sapgw98 3398/tcp
sapgw99 3399/tcp
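The entries above follow a simple pattern: the dispatcher port for SAP instance NN is 3200 + NN, and the gateway port is 3300 + NN. A short sketch (illustrative only, not part of the product) that generates the same lines:

```python
def sap_service_entries(max_instance=99):
    # Build the sapdpNN (dispatcher) and sapgwNN (gateway) services-file
    # lines for instance numbers 00 through max_instance.
    lines = [f"sapdp{nn:02d} {3200 + nn}/tcp" for nn in range(max_instance + 1)]
    lines += [f"sapgw{nn:02d} {3300 + nn}/tcp" for nn in range(max_instance + 1)]
    return lines

entries = sap_service_entries()
print(entries[0])    # sapdp00 3200/tcp
print(entries[-1])   # sapgw99 3399/tcp
```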

Troubleshooting for RFC Installation for SAP R/3


This troubleshooting information applies to all versions of SAP R/3. If the RFC
utility does not work, do the following:
1. Verify that the settings are correct on the Attributes page for Transaction
SE37.
2. If you do not know which function group to use, start Transaction SE84
to see a list.
a. Choose Programming ➤ Function Builder, then double-click Function
groups. The R/3 Repository Information System: Function groups dialog
appears, where you specify the selection criteria.
b. Type z* in the Func. group name field, and execute it to display a list of
available function groups. The Function groups dialog appears.
c. If you are not sure which group to use, start a new Transaction SE80. The
Object Navigator dialog appears.
d. Several functions are displayed under the function modules branch.
Display the different function groups (as listed in SE84 in step 2) until you
find the one you want. The Import page appears.
3. Verify that the parameters are correct for the Import page according to the
following guidelines:
a. The LIKE entry for Type is case-sensitive.
b. The Optional value must be clear for the I_SERVE_TYPE parameter. The
other parameters must have Optional selected.
c. The RFC function cannot pass a value by reference. Pass value must be
selected.
4. Verify that the parameters for the Export, Tables, and Exceptions tabs are
correct.


5. Click Source code. The function module header appears at the top of the
code.
The IMPORTING and EXPORTING values must be passed by VALUE, not
REFERENCE. Instead of uploading the code into the Editor, do the following:
a. Copy the code in the ZDSRFC.TXT file, from TABLES through the last
ENDIF, to the clipboard.
b. Paste the code to the Editor in the function module.
c. Check that the syntax for this function procedure displays No syntax
error found.
6. Activate Z_RFC_DS_SERVICE. Do this by selecting Function module ➤
Activate (or Ctrl+F3) to activate the code so that it can function properly.
7. As part of the installation process, test that the function works in your
environment by testing the version number or the table content:
SAP and RFC code version number. Do the following:
a. Start Transaction SE37, and type Z_RFC_DS_SERVICE.
b. Press F8. The Test Function Module: Initial Screen appears.
c. Click Execute. The Test Function Module: Result Screen appears. The
SAP version number (for example, 46B) and the RFC utility version
number are displayed.
Table content. Do the following:
a. Start Transaction SE37, and type Z_RFC_DS_SERVICE in the Function
module field.
b. Press F8. The Test Function Module: Initial Screen appears.
c. Type 5 in the I_SERVE_TYPE field.
d. Type T000 for I_TABLE_NAME.
e. Click Execute. The Test Function Module: Result Screen appears. If
T_FIELDS and T_DATA contain records, the RFC utility works properly.




B
Packaged Extractions for
ABAP

The DataStage ABAP Extract for SAP R/3 plug-in includes pre-packaged
extraction jobs that help you understand common extraction scenarios.
These pre-packaged extractions are provided as a DataStage export file
(Extracts.dsx). This file is located on the DataStage PACK for SAP R/3 client CD in
the templates directory. Use the DataStage Manager import function to import any
or all of the jobs contained in the file.
You can customize these jobs for your ETL initiatives by providing parameters
specific to your environment and by using the extraction object and corresponding
ABAP that already exist in the job. The ABAP Extract plug-in also includes jobs
that extract long text fields from R/3. The following list describes these jobs and
the SAP tables from which they extract data:
• LongTextCPIC. This job uses CPIC data transfer method to extract long
text from VarChar fields. It decompresses them using the READ_TEXT
function module.

Note: The Extraction Object and the corresponding ABAP program cannot
be regenerated, because regeneration overwrites the code that calls
the function module.

• LongTextFTP. This job uses the FTP data transfer method to extract long
text from VarChar fields. It decompresses them using the READ_TEXT
function module.

Note: The Extraction Object and the corresponding ABAP program cannot
be regenerated, as this results in overwriting the code to call the
function module. If Path of Remote File in the Output ➤ Runtime tab is
changed, you must manually search for the old path in the ABAP
program and replace it with the new one.

• MARAextract. Extracts a range of materials from MARA and their
corresponding descriptions from the MAKT table.
• MaterialExtract. Extracts data from the primary material tables, namely,
MARA, MARM, MBEW, MVKE, MARC, MARD, MCHB, MKOL, MLGN,
and MLGT.
• SalesExtract. Extracts data from the VBAK and VBAP tables.
• SalesCustomerMasterExtract. Extracts data from the KNA1, KNB1, and
KNVV tables.


C
Properties for the IDoc
Extract Plug-In

The IDoc Extract stage contains grid-style stage and output properties that are
visible from the DataStage Designer. These are included for reference only since
you cannot configure the stage properly using the grid editor.
The next table includes the following column heads:
• Prompt is the text that the job designer sees in the stage editor user
interface.
• Description describes the properties.
The IDoc Extract stage supports the properties listed in the following table:

Properties

Prompt                     Description
USERNAME                   Stage. The user name used to connect to SAP.
PASSWORD                   Stage. The password used to connect to SAP.
CLIENT                     Stage. The client number used to connect to SAP.
DESTINATION                Stage. The name of the physical connection defined
                           in the DSSAPConnections.config file.
LANGUAGE                   Stage. The language used to connect to SAP.
GWHOST                     Stage. The host on which the R/3 application
                           message server resides.
SYSNBR                     Stage. The system number of the R/3 instance.
ROUTERSTR                  Stage. Optional. A string used to connect to the
                           remote SAP server.
LOADBALANCING              Stage. A Boolean value that determines whether
                           this is a load-balancing connection.
USEDDEFAULTSAPCONNECTION   Stage. A Boolean value that, if true, determines
                           whether to use the default values from the
                           DSSAPConnections.config file. All stage properties
                           up to this point are ignored.
TESTMODE                   Stage. The value specifying that this job does not
                           update the bookmark file when it is run. Also, the
                           job is not started automatically by the listener
                           when its batch count threshold is exceeded.
SEGTYP                     Output. The segment type of the IDoc to be
                           processed by this link.


D
File Permissions for IDoc
Extract Plug-In

This section documents permissions on files that are created by the IDoc Extract
Pack. For more information on umask, see the UNIX documentation for your
system.

Configuring Connections and IDoc Types


When a file is created on UNIX systems, it is given a set of default permissions. The
permissions that are assigned are determined by the file creation mask of the
process that creates the file. The umask changes the file creation mask by removing
permissions from the existing mask. It consists of three groups of three bits each:
• The first group relates to permissions for the user who owns the file
• The second group relates to permissions for the group that owns the file
• The third group relates to everyone else
When a file is created, ownership is set to the user id of the process that created the
file. Group ownership is set to the primary group of the user who created the file.
Its permissions are then determined by its file creation mode with the umask
bits removed. Child processes inherit the umask value and user id of their
parent process.
For example, if the default file creation mode of a file is -rw-rw-rw- (666), and the
umask for the process is 002, the file created has permissions of -rw-rw-r-- (664).
This means the user and the user's group have read and write permissions but
everyone else only has read permissions.
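The arithmetic in the example above can be verified with a short sketch (the path is illustrative; this assumes a UNIX-like system where new files are created with a default mode of 666):

```python
import os
import stat

old_umask = os.umask(0o002)                  # remove "other write" from new files
path = f"/tmp/umask_demo_{os.getpid()}.txt"  # illustrative temporary path
fd = os.open(path, os.O_CREAT | os.O_WRONLY, 0o666)
os.close(fd)
mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))                             # 0o664, i.e. -rw-rw-r--
os.remove(path)
os.umask(old_umask)                          # restore the previous umask
```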


Files created by the IDoc Extract Pack are subject to two sets of permissions,
depending on how they are created:
• The first set belongs to those files that are created by child processes of the
dsidocmgr executable, such as the dsidocsvr executable.
• The second set belongs to those that are created by DataStage client connec-
tions, such as the IDoc stage editor and the DataStage Administrator for
SAP.
These files inherit the user id and group id of the user logged into the
DataStage client and the umask setting of the DataStage server process
owner. Note that the umask is not that of the user logged in to the client.
When the IDoc Extract plug-in is installed, it creates a directory named
DSSAPConnections in the directory where the DataStage server is installed. This
directory is owned by root, and the permissions are set so that everyone has
permissions to read and write, regardless of the umask for the process.
A file called DSSAPConnections.config is created in the DSSAPConnections directory
when a user logs in using the DataStage client and has permissions subject to the
umask with which the DataStage server is started.
Several other files with the .config suffix reside in this directory. (The DataStage
client creates these for internal purposes.)
• The IDoc.SegmentAdminFields.3.config and IDoc.SegmentAdminFields.2.config
files are created the first time a user accesses meta data for an IDoc from a
particular ALE port version.
• The IDocCleanup.config file is created the first time a user accesses the IDoc
Cleanup and Archiving page in the DataStage Administrator for SAP.
• The SAPVersions.config file is created at the time the first connection to SAP
is created.
The user who first logs into DataStage to perform these activities owns these files,
which are subject to the umask setting that the DataStage server was started with.
When a connection to SAP is created, a directory with the same name as the
connection is created, containing a subdirectory called IDocTypes. These directories
are owned by the user who created the connection, with permissions subject to the
umask setting that the DataStage server was started with.

Note: Users who want to create a connection to SAP, using either the stage
editor or the DataStage Administrator for SAP, must have write permission
to the DSSAPConnections.config file and the DSSAPConnections directory.


When the first IDoc type is configured, a file called IDocTypes.config is created in the
IDocTypes directory, so the user who configures this first IDoc type must have write
permissions to the IDocTypes directory. Remember that this directory was created
by the user who defined the connection, who may differ from the user defining
the first IDoc type.
When a user configures subsequent IDoc types, that user must have write
permissions to the IDocTypes.config file and permissions to create the directory
location (PSA) to be configured for that type. This directory can be configured
anywhere on the file system, but is created by default in the IDocTypes directory.
In the configured PSA, another file with the .config suffix is created that contains
parameters for this IDoc type. The user who logs into DataStage to configure this
IDoc type owns these files, which are subject to the umask setting that the
DataStage server was started with.
Any user who later modifies the configuration for an IDoc type must have write
permissions to:
• The IDocTypes.config file
• The directory the IDoc type is to be written to
• The IDoc type config file in that directory
If the modification specifies a new directory location, the user must also have
write permission to that new location.
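A quick pre-flight check for the three write permissions listed above might look
like this (the IDoc type name and paths are illustrative, not the product's
actual layout):

```shell
# Verify write access to the IDocTypes.config file, the PSA directory for the
# type, and that directory's own config file before editing an IDoc type.
for f in IDocTypes/IDocTypes.config \
         IDocTypes/MATMAS01 \
         IDocTypes/MATMAS01/MATMAS01.config; do
  if [ -w "$f" ]; then
    echo "writable: $f"
  else
    echo "NO write permission: $f"
  fi
done
```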

Running DataStage jobs


A DataStage job runs under the user id of the user logged in to DataStage, no
matter how the job is started. If the RFC Server starts the job, the job runs
with the DataStage user id configured for that connection and IDoc type, not
with the id of the user who owns the RFC Server process (the RFC Server runs
as root).
The first user who runs a job outside test mode that reads IDocs of a given type
from a given directory location must have permission to create a bookmark file.
The bookmark file is created in the directory from which the IDocs for that type
are read. This file is named after the IDoc type with the .bmk suffix and is
owned by the user who creates it, with permissions subject to the umask setting
with which the DataStage server was started.
In addition, a file named saprfc.ini exists in all project directories after the first
DataStage job is run. This is the case with any DataStage PACK for SAP R/3, such
as the ABAP Extract and the Load PACK for SAP BW. The user who runs the job
owns the saprfc.ini file, with permissions subject to the umask setting with which
the DataStage server is started. Each subsequent job run removes and recreates this
file, so all the users who want to run jobs in this project must be able to write to this
file. For example, if user A runs a Load PACK for SAP BW job, and user B
subsequently runs an IDoc Extract job, the job fails unless user B has permission to
remove and recreate this file.
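One way to satisfy this, assuming all job-running users share a primary group, is
to start the DataStage server with umask 002 so that saprfc.ini comes out
group-writable; a minimal sketch:

```shell
# Sketch: with umask 002, saprfc.ini is created mode 664 (rw-rw-r--), so any
# user in the owning group can remove and recreate it on the next job run.
umask 002
touch saprfc.ini
ls -l saprfc.ini
```

Note that removing the file also requires write permission on the project
directory itself, so that directory's group permissions matter as well.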

Unconfigured IDoc types


When the RFC Server receives an IDoc that it does not recognize, it creates a
directory in the default location for that IDoc type. Since the RFC Server runs as
root, it can override the umask setting. It does this by creating this directory with
read, write, and execute permissions for user, group, and other so that another user
can later configure that location. All IDoc files that the RFC Server receives and
writes to the PSA are owned by root and subject to the umask settings with which
the RFC Server Manager is started.
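The effect described above is the same as creating the directory with an
explicit mode, which bypasses the umask entirely; a sketch with an illustrative
directory name:

```shell
# mkdir -m sets the mode as chmod would, so even a restrictive umask does not
# narrow the permissions of the new PSA directory.
umask 077
mkdir -m 777 MATMAS01_psa
ls -ld MATMAS01_psa   # drwxrwxrwx regardless of the umask
```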

Configuring IDoc Archival and Cleanup


The cleanup scheduling mechanism uses the crontab command. Consequently, user
permissions to modify the cleanup schedule are subject to that user's permissions
to execute the crontab command.
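For reference, a per-user crontab entry for a Saturday-noon cleanup would look
like the following (the cleanup command path is a placeholder, not the tool's
actual command line):

```shell
# minute hour day-of-month month day-of-week  command
0 12 * * 6 /opt/dstage/idoc_cleanup.sh   # 12:00 every Saturday
```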
Since you cannot use the crontab command to manipulate other users' crontab
files, the information in the IDoc Cleanup and Archiving page of the DataStage
Administrator for SAP applies only to the user logged in to the tool.
For example, if user A schedules a cleanup for 12:00 every Saturday, and user B
wants to change the cleanup schedule, user B can add a new entry, but adding it
does not remove user A's entry. Likewise, when user B logs in to the tool, the
cleanup settings do not reflect those that user A established. DataStage has the
same limitation when you log in to the DataStage Director.

Recommendations
• The umask setting should apply a consistent set of permissions to all users
who log in to DataStage for the purposes of administering the IDoc Extract
functionality. In addition, the umask setting should be consistent for all
users running DataStage jobs for all SAP R/3 products.
For example, if only one user will ever administer DataStage connections to
SAP or run DataStage jobs using SAP R/3 plug-ins, the umask should be
022. If only users from the same group will perform these functions, the
umask should be 002. If you want users to administer DataStage
connections to SAP or run DataStage jobs using SAP R/3 plug-ins regardless
of user id or group, the umask must be explicitly set to 000.
• For consistent permissions, the dsidocmgr executable should be invoked
with the same umask as the DataStage server. The dsidocd.rc script used to
start dsidocmgr references the dsenv file containing the environment
settings for the DataStage server, so an explicit umask set in this file
applies to both.
• Only one user should change the cleanup settings on the system. This lets
the DataStage Administrator for SAP reflect the state of the system
accurately.
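Following the second recommendation, the umask can be set once in dsenv, which
both the DataStage server startup and dsidocd.rc source; a sketch (the ./dsenv
path below stands in for the real dsenv location):

```shell
# Append an explicit umask to a dsenv-style file and source it; every process
# that sources this file then creates files with the same group-writable
# permissions (002 suits the same-group case recommended above).
echo 'umask 002' >> ./dsenv
. ./dsenv
touch example.cfg
ls -l example.cfg   # -rw-rw-r--
```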


Index

Symbols

.bmk file suffix D-3
.config file
  example 3-14

A

ABAP definition 1-5
ABAP Editor 2-20
ABAP Extract 2-18
ABAP Extract plug-in
  architecture 2-3
  functionality 2-2
  overview 2-1
  terminology 1-5
ABAP Extract programs
  running in background 2-15
ABAP Extract Stage dialog 2-7
ABAP Program Development Status, ABAP Extract 2-21
ABAP Program page 2-18
activating BAPI parameters
  BAPI plug-in 5-14
Add Columns dialog 3-33
  IDoc Load plug-in 4-17
ADM_ prefixes
  IDoc extraction 3-33
administrative fields
  IDoc extraction 3-33
Administrator for SAP utility 6-1
Advanced Settings for IDoc Cleanup and Archiving dialog
  DataStage Administrator for SAP 6-17
alias for remote path, ABAP Extract 2-14
architecture
  ABAP Extract plug-in 2-3
  BAPI plug-in 5-3
  IDoc Extract plug-in 3-3
archiving IDocs 3-10
archiving processed IDocs
  IDoc Extract 3-27
assigned input links 4-12
authorization profile, ABAP Extract 2-4
automatic job execution
  IDoc extraction 3-8

B

background processing, ABAP Extract 2-15
BAPI definition 1-5
BAPI Explorer dialog
  BAPI plug-in 5-11
BAPI input meta data
  BAPI Input Columns page 5-18
BAPI Input page
  BAPI plug-in 5-13
BAPI plug-in 5-14, 5-16, 5-17
BAPI plug-in run-time component 5-26
BAPI Stage dialog 5-4
BAPI Stage General page 5-6
BAPISeqNo field
  BAPI plug-in 5-18
batch count
  DataStage limit for IDocs 3-8
bookmark files 3-10
bookmark files for IDoc Extract
  PSA 3-19
BOR browsing
  BAPI plug-in 5-11
BOR definition 1-6

browsing SAP BOR
  BAPI plug-in 5-11
Build Extraction Object
  program generation, ABAP Extract 2-19
Build Extraction Object dialog 2-20
Build SQL query
  program generation, ABAP Extract 2-19
Build SQL Query dialog 2-20
building
  extraction objects or SQL queries, ABAP Extract 2-20
  SQL queries, ABAP Extract 2-20
Business Object definition 1-6

C

changing data transfer methods, ABAP Extract 2-15
character set maps
  ABAP Extract plug-in 2-8
cleanup of IDoc files 6-16
cleanup scheduling
  crontab command D-4
cofiles and data files
  copying, for ABAP Extract 2-4
columns
  adding for IDoc extraction 3-33
columns and corresponding segment types
  IDoc Load plug-in 4-18
configure IDoc types
  IDoc Load plug-in 4-10
configuring
  IDoc type properties 6-1
  R/3 connection properties 6-1
configuring global properties 6-1
connecting to SAP R/3 systems from DataStage
  IDoc Extract plug-in 3-21
Connection Properties dialog
  accessing from ABAP Extract GUI 2-11
  BAPI plug-in 5-8
  IDoc Load plug-in 4-7
connections
  managing for IDoc Extract 3-23
  selecting for ABAP Extract 2-11
connections to SAP
  ABAP Extract 2-9
Control record definition 1-6
control records
  IDoc extractions 3-29
CPI-C data transfer method, ABAP Extract 2-14
creating programs manually, ABAP Extract 2-20
creating tables 2-23
crontab command D-4

D

data transfer methods
  ABAP Extract 2-13
  DataStage Administrator for SAP 6-14
DataStage Administrator for SAP utility 6-1
DataStage Connections to SAP page
  DataStage Administrator for SAP 6-2
DataStage Job Options for IDocs page
  DataStage Administrator for SAP 6-9
DataStage Job Options page
  DataStage Administrator for SAP 6-8
defining
  ABAP Extract Stage properties 2-7
  IDoc Extract stage 3-21
  properties for IDoc types for IDoc Extract 3-26
defining ABAP Extract Stage properties 2-7
defining BAPIs
  BAPI plug-in 5-13


defining character set mapping
  IDoc Load plug-in 4-13
defining character set maps
  ABAP Extract plug-in 2-8
  BAPI plug-in 5-5
defining connection properties
  IDoc Load plug-in 4-7
defining IDoc Load input data 4-13
defining IDoc Load stage 4-5
defining IDoc type configuration properties
  IDoc Extract 3-24
defining input properties
  BAPI plug-in
    BAPI Input General page 5-9
defining log files 5-16
defining output properties
  ABAP Extract 2-8
defining SAP connections
  BAPI plug-in 5-6
defining stage properties
  BAPI plug-in 5-4
dsenv file
  IDoc Extract plug-in 3-11
dsidocd.rc script
  permissions D-5
  RFC Listener Manager 3-7

E

E2MAKTM
  example of translation to DataStage columns 3-17
Enter program as text
  program generation, ABAP Extract 2-20
ERP definition 1-6
errors
  IDoc extraction 3-30
example
  .config file 3-14
  E2MAKTM meta data translation to DataStage columns 3-17
  IDoc Load assigned input links 4-19
  IDocTypes.config file for IDoc extraction 3-13
  MATMAS01 3-16
  relationships between DataStage jobs and IDoc types 3-9
  RFC listener servers 3-9
Export Connection dialog
  DataStage Administrator for SAP 6-13
Export log files
  BAPI plug-in 5-27
Export page
  Input BAPI page 5-15

F

filtering Business Objects
  BAPI plug-in 5-10
Find IDoc Type dialog 6-14
  IDoc Extract plug-in 3-25
F-keys
  IDoc Load plug-in 4-16
foreign keys
  IDoc Load plug-in 4-16
format error
  IDoc extraction 3-30
FTP data transfer method, ABAP Extract 2-14
functionality
  ABAP Extract plug-in 2-2
  BAPI plug-in 5-2
  IDoc Extract plug-in 3-2
  IDoc Load plug-in 4-2

G

generation methods, ABAP Extract programs 2-19


I

IDoc Cleanup and Archiving page
  DataStage Administrator for SAP 6-16
IDoc control records 3-29
IDoc definition 1-6
IDoc Extract plug-in 3-10
  overview 3-1
IDoc extraction components
  selecting 3-32
IDoc Listener Settings page
  DataStage Administrator for SAP 6-7
IDoc Load Columns page 4-16
IDoc Load Input General page 4-13
IDoc Load plug-in 4-12, 4-18
  overview 4-1
IDoc Load Settings page
  IDoc Load Connection Properties dialog 4-9
IDoc Log dialog
  DataStage Administrator for SAP 6-15
IDoc segment definitions
  saving for IDoc extraction 3-35
IDoc type definition 1-6
IDoc Type page
  IDoc Extract plug-in 3-23
IDoc Type Properties dialog
  IDoc Extract plug-in 3-24, 3-26
IDoc Types dialog
  DataStage Administrator for SAP 6-3, 6-14
IDoc types for IDoc Extract
  finding substrings 3-25
IDocTypes.config file for IDoc extraction
  example 3-13
Import IDoc Type Configurations dialog
  DataStage Administrator for SAP 6-11
Import page
  BAPI plug-in 5-14
Importing at the operating system level, ABAP Extract 2-6
importing into existing connections
  DataStage Administrator for SAP 6-11
importing new connections to SAP
  DataStage Administrator for SAP 6-10
importing the RFC and authorization profile, ABAP Extract 2-4
importing transport requests, ABAP Extract 2-5
importing using Transaction STMS
  ABAP Extract 2-5
Input BAPI Export page
  BAPI plug-in 5-15
Input BAPI Import page 5-14
Input BAPI Logs page
  BAPI plug-in 5-16
Input BAPI Tables page
  BAPI plug-in 5-15
Input Columns page
  BAPI plug-in 5-18
installation
  requirements for PACK 1-1
installation prerequisites
  ABAP Extract plug-in 2-3
installation requirements
  IDoc Load plug-in 4-3
installing
  plug-in clients 1-5
  server components on Windows 1-3
iRFC function module, ABAP Extract 2-4

J

Job Inactivity Timeout setting 6-17

K

Key Columns for Link dialog
  IDoc Load plug-in 4-20

L

libraries 1-2
listener log files


  IDoc Extract plug-in 3-6
listener servers
  IDoc Extract plug-in 3-6
listener sub-system
  IDoc Extract plug-in 3-5
load balancing
  BAPI plug-in 5-8
  DataStage Administrator for SAP 6-6
load methods, ABAP Extract 2-20
loading
  ABAP programs, ABAP Extract 2-20
local data files, ABAP Extract 2-15
local storage for IDocs 3-9
log files directory
  BAPI plug-in 5-17
logs
  IDoc listener 6-15

M

managing connections for IDoc Extract 3-23
manual job execution
  IDoc extraction 3-8
MATMAS01 definition 1-6
MATMAS01 example 3-16
methods
  generating ABAP Extract programs 2-19
modifying columns
  IDoc Load plug-in 4-16

N

NLS page
  ABAP Extract plug-in 2-8
  BAPI plug-in 5-5
  IDoc Load plug-in 4-13

O

opening, ABAP Extract 2-20
Output General page
  ABAP Extract plug-in 2-9
  IDoc extraction
    IDoc Extract Output General page 3-31
Output page
  ABAP Extract plug-in 2-8
overview
  ABAP Extract plug-in 2-1
  BAPI plug-in 5-1
  IDoc Extract plug-in 3-1

P

PACK definition 1-6
parent and child segment types
  IDoc Load plug-in 4-19
partner number
  IDoc Load plug-in 4-9
path of remote file
  entering, ABAP Extract 2-14
permissions
  IDoc Extract bookmark files 3-19
permissions on bookmark file 3-19
Port Version list
  DataStage Administrator for SAP 6-10
port versions
  IDoc extraction 3-30
preventing updates in bookmark files 3-18
primary keys
  IDoc Load plug-in
P-keys
  IDoc Load plug-in 4-16
processing IDocs using DataStage
  summary 3-2
program generation
  ABAP Extract 2-18
Program Generation Options dialog 2-20
programs
  running in background, ABAP Extract 2-15
PSA


  bookmark files for IDoc Extract 3-19
PSA definition 1-6
PSA maintenance 3-10
PSA property 3-9

R

R/3 source systems configuration
  IDoc Extract plug-in 3-5
R/3 upgrades
  changing segment meta data for IDoc Extract 3-28
R/3 Versions 3-29
restore bookmark files to original state 3-18
return parameters 5-17
RFC definition 1-6
RFC, importing for ABAP Extract 2-4
root segments
  IDoc Load plug-in 4-19
run methods, ABAP Extract 2-20
running background processes, ABAP Extract 2-15
running DataStage jobs automatically 6-8

S

saprfc.ini file 3-19, D-3
  IDoc Extract plug-in 3-19
  permissions D-3
Save IDoc Segment Definition dialog 3-35
  IDoc Load plug-in 4-18
scheduling jobs
  IDoc Extract 3-28
Segment definition 1-7
segment definition names
  IDoc extraction 3-29
segment meta data for R/3 upgrades
  changing for IDoc Extract 3-28
segment type definition 1-7
Select Application Component ID dialog
  BAPI plug-in 5-10
Select DataStage Connection to SAP dialog
  ABAP Extract plug-in 2-11
  BAPI plug-in 5-7
  IDoc Extract plug-in 3-22
  IDoc Load plug-in
    Connection and Logon Details page
    Connection Properties dialog 4-7
Select IDoc Component to Extract dialog 3-32
Select IDoc Component to Load dialog
  IDoc Load plug-in 4-15
Select IDoc Type dialog
  IDoc Extract plug-in 3-24
  IDoc Load plug-in 4-10, 4-11
selecting
  ABAP Extract data transfer methods 2-13
  control records or segments for IDoc extraction 3-32
selecting a BAPI
  BAPI plug-in 5-11
selecting DataStage connection to SAP
  IDoc Load plug-in 4-5
selecting IDoc components
  IDoc extraction 3-32
selecting IDoc Types
  IDoc Extract plug-in 3-23
selecting IDoc types
  IDoc Load plug-in
    IDoc Load Stage IDoc Type page 4-10
selecting segment definitions
  IDoc Load plug-in 4-15
specifying
  ABAP program properties, ABAP Extract 2-18
specifying for IDoc Extract 3-29


Stage General page
  IDoc Extract plug-in 3-21
  IDoc Load plug-in 4-5
Stage IDoc Type page
  IDoc Extract plug-in 3-23
starting and stopping RFC Listener Manager on UNIX 3-7
starting and stopping RFC Listener Manager on Windows 3-6
status, program development
  ABAP Extract 2-21
summary
  output links for IDoc extraction 3-31
  processing IDocs using DataStage 3-2
summary of IDoc Extract stage and links 3-20
synchronizing columns
  IDoc Load plug-in 4-20

T

tables
  creating 2-23
Tables log files
  BAPI plug-in 5-27
Tables page
  Input BAPI page 5-15
temporary IDoc files directory for IDoc extraction
  pathname for temporary IDoc files for IDoc extraction 3-27
temporary segment index files
  IDoc Load plug-in 4-9
Terminology
  ABAP Extract plug-in 1-5
Test Mode
  IDoc Extract Stage General page 3-22
  processing IDocs 3-18
testing load balancing 6-7
tp import, ABAP Extract 2-4
Transaction SE80, ABAP Extract 2-6
Transaction STMS, ABAP Extract 2-4
transferring data
  methods for ABAP Extract 2-13
transport control program, ABAP Extract 2-4
Transport Management System, ABAP Extract 2-4
transport requests, importing for ABAP Extract 2-5
tRFC port definition 1-7
tRFC ports 3-6

V

validating
  ABAP Extract stage 2-10
validating columns 4-18
Validation Stage dialog 2-10
Variant definition 1-7
versions of R/3
  DataStage Administrator for SAP 6-10
viewing Export parameters
  BAPI plug-in 5-15
viewing Import parameters
  BAPI plug-in 5-14
viewing Tables parameters
  BAPI plug-in 5-14, 5-15

W

Windows
  installing server components 1-3

Z

Z_DS-PROFILE profile, ABAP Extract 2-6
Z_RFC_DS_SERVICE RFC function module, ABAP Extract 2-6
ZDSRFC.TXT file, ABAP Extract 2-4
ZETL development class, ABAP Extract 2-6
