
Creating Data Transfer Processes

Use
You use the data transfer process (DTP) to transfer data from source objects to target objects in BI. You can also use the data transfer process to access InfoProvider data directly.

Prerequisites
You have used transformations to define the data flow between the source and target object.

Procedure
Creating Data Transfer Processes Using Process Chains
You are in the plan view of the process chain that you want to use for the data transfer process. The process type Data Transfer Process is available in the Load Process and Post-Processing process category.
...

1. Use drag and drop or double-click to insert the process into the process chain.
2. To create a data transfer process as a new process variant, enter a technical name and choose Create. The dialog box for creating a data transfer process appears.
3. Select Standard (Can Be Scheduled) as the type of data transfer process.

You can only use the type DTP for Direct Access if the target of the data transfer process is a VirtualProvider. More information: Creating Data Transfer Processes for Direct Access. If you use the data transfer process in a process chain with a DataStore object as the target, you can only use the standard data transfer process. More information about data transfer processes for real-time data acquisition: Creating Data Transfer Processes for Real-Time Data Acquisition.

4. Select the target and source object. First select the object type. Two input helps are available when you select the source and target objects:
   - The input help with the quick info Input Help: Existing Paths provides a selection of the objects that were already defined in the data flow for the starting object. If there is only one object in the data flow, it is selected by default.
   - The input help with the quick info Input Help: List of All Objects enables you to select the object from the complete list of BI objects.
5. Choose Continue.
6. The data transfer process maintenance screen appears. The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.
7. On the Extraction tab page, specify the parameters:
   a. Choose Extraction Mode. You can choose Delta or Full mode.

Unlike delta transfer using an InfoPackage, an explicit initialization of the delta process is not necessary for delta transfer with a DTP. When the data transfer process is executed in delta mode for the first time, all existing requests are retrieved from the source, and the delta status is initialized.

Only the extraction mode Full is available for the following sources:
   - InfoObjects
   - InfoSets
   - DataStore Objects for Direct Update
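The difference between the two extraction modes can be illustrated with a short sketch. This is plain Python modeling the request bookkeeping described above, not an SAP API; the class and attribute names are invented for illustration:

```python
# Simplified model of DTP extraction modes (illustrative only, not an SAP API).
class SourceObject:
    def __init__(self):
        self.requests = []          # requests loaded into the source, in order

    def add_request(self, req_id):
        self.requests.append(req_id)

class DataTransferProcess:
    def __init__(self, source, mode="delta"):
        self.source = source
        self.mode = mode
        self.fetched = set()        # delta status: source requests already transferred

    def execute(self):
        if self.mode == "full":
            # Full mode always reads the complete data set from the source.
            return list(self.source.requests)
        # Delta mode: the first run retrieves all existing requests and
        # thereby initializes the delta status; no explicit delta
        # initialization (as with an InfoPackage) is needed.
        new = [r for r in self.source.requests if r not in self.fetched]
        self.fetched.update(new)
        return new

src = SourceObject()
src.add_request("REQ1")
src.add_request("REQ2")

dtp = DataTransferProcess(src, mode="delta")
first = dtp.execute()    # first delta run: retrieves REQ1 and REQ2
src.add_request("REQ3")
second = dtp.execute()   # subsequent run: retrieves only REQ3
```

The first delta execution behaves like a full load plus delta initialization in one step, which is exactly why no separate initialization request is required.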

If you have selected extraction mode Delta, you can define further parameters:

   i. With Only Get Delta Once, define whether the source requests should be transferred only once. Setting this flag ensures that the content of the InfoProvider is an exact representation of the source data.

      A scenario of this type may be required if you always want an InfoProvider to contain the most recent data for a query, but technical reasons prevent the DataSource on which it is based from delivering a delta (new, changed, or deleted data records). For this type of DataSource, the current data set for the required selection can only be transferred using a full update. In this case, a DataStore object cannot normally be used to determine the missing delta information (overwrite and create delta). If this is not logically possible, for example because data is deleted in the source without delivering reverse records, you can set this indicator and implement a snapshot scenario. Only the most recent request for this DataSource is retained in the InfoProvider. Earlier requests for the DataSource are deleted from the (target) InfoProvider before a new one is requested (this is done by a process in a process chain, for example); they are not transferred again by the DTP delta process. When the system determines the delta on generation of a new DTP request, these earlier (source) requests are considered to have already been retrieved.

   ii. Define whether you want to Get All New Data in Source Request by Request. Since a DTP bundles all transfer-relevant requests from the source, it sometimes generates large requests. If you do not want to use a single DTP request to transfer the dataset from the source because the dataset is too large, you can set the Get All New Data in Source Request by Request flag. This specifies that you want the DTP to read only one request from the source at a time. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.

You can change this flag at any time, even if data has already been transferred. If you set this flag, you can transfer data by request as a one-off activity. If you deselect the flag, the DTP goes back to transferring all new source requests at once at periodic scheduled intervals.
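The request-by-request behavior can be sketched as a simple loop. Again this is an illustrative Python model of the described behavior, not SAP code; the function and field names are assumptions:

```python
# Illustrative sketch (hypothetical names, not an SAP API): with
# "Get All New Data in Source Request by Request" set, each DTP request
# reads exactly one source request; after processing, the DTP checks for
# further new source requests and creates an additional DTP request if
# any exist, until no new data remains.
def transfer_request_by_request(source_requests, already_fetched):
    """Return the list of DTP requests created, one per new source request."""
    dtp_requests = []
    while True:
        new = [r for r in source_requests if r not in already_fetched]
        if not new:
            break                       # no more new data in the source
        src_req = new[0]                # read only one source request at a time
        already_fetched.add(src_req)
        dtp_requests.append({"dtp_request": len(dtp_requests) + 1,
                             "source_request": src_req})
    return dtp_requests

result = transfer_request_by_request(["REQ1", "REQ2", "REQ3"], set())
# three DTP requests are created, each transferring a single source request
```

Without the flag, the same three source requests would be bundled into one (potentially very large) DTP request.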

If you set the indicator for a DTP that was created prior to NetWeaver 7.0 Support Package Stack 13, the DTP request only retrieves the first source request. This restricts the way in which such DTPs can be used, because requests accumulate in the source and the target might not contain the current data. To avoid this, you need to execute the DTP manually until all the source requests have been retrieved. The system therefore also displays the indicator Retrieve Until No More New Data for such DTPs. If you also set this indicator, the DTP behaves as described above and creates DTP requests until all the new data has been retrieved from the source.

   b. If necessary, define filter criteria for the delta transfer by choosing Filter. You can then use multiple data transfer processes with disjunctive selection conditions to efficiently transfer small sets of data from a source into one or more targets, instead of transferring large volumes of data. The filter restricts the amount of data to be transferred and works like the selections in the InfoPackage. You can specify single values, multiple selections, intervals, selections based on variables, or routines. Choose Change Selection to change the list of InfoObjects that can be selected.

      The icon next to the Filter pushbutton indicates that predefined selections exist for the data transfer process. The quick info text for this icon displays the selections as a character string.

   c. Choose Semantic Groups to specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, define key fields. Data records that have the same key are combined in a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten. It also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected. More information: Handling Data Records with Errors and Error Stack.
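The effect of a semantic group can be sketched as follows. This is a plain Python analogy with invented field names, not the actual package-building logic of BI; it only shows the grouping idea: records that share the chosen key fields always land in the same data package:

```python
# Illustrative sketch of semantic grouping (invented field names, not an
# SAP API): records with the same values in the semantic key fields are
# combined into the same data package.
from collections import OrderedDict

def build_packages(records, key_fields):
    packages = OrderedDict()                 # preserves first-seen key order
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        packages.setdefault(key, []).append(rec)
    return list(packages.values())

records = [
    {"order": 1000, "item": 10, "qty": 5},
    {"order": 2000, "item": 10, "qty": 1},
    {"order": 1000, "item": 20, "qty": 2},
]
packages = build_packages(records, key_fields=["order"])
# two packages: both records for order 1000 travel together
```

Keeping same-key records in one package is what allows a package to be processed (or diverted to the error stack) as a consistent unit, so that overwriting data fields in a DataStore object happens in the correct order.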

During parallel processing of time-dependent master data, the semantic key of the DTP may not contain the field of the data source.

   d. Define any further settings that depend on the source object and data type.

8. On the Update tab page, specify the parameters:
   a. Make the settings for error handling. Define the following:
      - How you want to update valid records when errors occur.
      - How many errors can occur before the load process terminates.

      More information: Handling Data Records with Errors.
   b. Apply any further settings that are relevant for the target object.
9. On the Execute tab page, define the parameters. On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.
   a. Specify the status that you want the system to adopt for the request if warnings are displayed.
   b. Specify how you want the system to define the overall status of the request.
   c. Normally, the system automatically defines the processing mode for the background processing of the respective data transfer process. If you want to execute a delta without transferring data, for example to simulate the delta initialization with the InfoPackage, select No data transfer; delta status in source: fetched as the processing mode. This processing mode is available when the data transfer process extracts in delta mode. In this case you execute the DTP directly in the dialog. A request started in this way marks the data that is found in the source as fetched, without actually transferring it to the target. You can still choose this mode even if delta requests have already been transferred for this data transfer process.

      If you want to execute the data transfer process in debugging mode, choose the processing mode Serially in the Dialog Process (for Debugging). In this case, you can define breakpoints in the tree structure for the process flow of the program. The request is processed synchronously in a dialog process, and the update of the data is simulated. If you select expert mode, you can also define selections for the simulation and activate or deactivate intermediate storage, in addition to setting breakpoints. More information: Simulating and Debugging DTP Requests.

      More information: Processing Types in the Data Transfer Process.
10. Check the data transfer process, then save and activate it.
11. Start process chain maintenance. The data transfer process is displayed in the plan view and can be linked into your process chain. When you activate and schedule the chain, the system executes the data transfer process as soon as it is triggered by an event in the predecessor process in the chain.
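The processing mode No data transfer; delta status in source: fetched can be pictured with a small sketch. This is an illustrative Python model (invented names, not an SAP API): the run updates the delta status exactly as a normal delta run would, but skips the transfer to the target:

```python
# Illustrative sketch (not an SAP API): the processing mode
# "No data transfer; delta status in source: fetched" marks all new
# source requests as retrieved without moving any data to the target.
def execute_dtp(source_requests, fetched, target, transfer_data=True):
    new = [r for r in source_requests if r not in fetched]
    fetched.update(new)                 # delta status: requests count as fetched
    if transfer_data:
        target.extend(new)              # normal mode: data reaches the target
    return new

fetched, target = set(), []
execute_dtp(["REQ1", "REQ2"], fetched, target, transfer_data=False)
# both requests are now marked as fetched, but the target stays empty;
# the next delta run transfers only requests that arrive afterwards
```

This mirrors simulating a delta initialization with an InfoPackage: existing data is declared "already retrieved" so that subsequent delta runs pick up only genuinely new requests.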

Creating Data Transfer Processes from the Object Tree in the Data Warehousing Workbench
The starting point when creating a data transfer process is the target where you want to transfer data to. In the Data Warehousing Workbench, an object tree is displayed and you have highlighted the target object.
...

1. In the context menu, choose Create Data Transfer Process. The dialog box for creating a data transfer process appears.
2. Proceed as described in steps 3 to 10 of the procedure for creating a data transfer process using a process chain. In step 4, you specify the source object only.

You can now execute the data transfer process directly.

Additional Functions
- Choose Goto → Overview of DTP to display information about the source and target objects, the transformations, and the last changes to the data transfer process.
- Choose Goto → Batch Manager Settings to make settings for parallel processing with the data transfer process. More information: Setting Parallel Processing of BI Processes.
- Choose Goto → Settings for DTP Temporary Storage to define the settings for the temporary storage. More information: Handling Data Records with Errors.
- Choose Extras → Settings for Error Stack to define the DB storage parameters. More information: DB Memory Parameters.

Creating Data Transfer Processes for Direct Access


Use
You use a data transfer process for direct access to access the data in an InfoProvider directly.

Prerequisites
You have used transformations to define the data flow between the source and target object.

Procedure
The starting point when creating a data transfer process is the target into which you want to transfer data. In the Data Warehousing Workbench, an object tree is displayed and you have highlighted the target object, a VirtualProvider.
...

1. In the context menu, choose Create Data Transfer Process. The dialog box for creating a data transfer process appears. DTP for Direct Access is displayed as the type of the data transfer process.
2. Select the type of source object. Supported object types are DataSources, InfoCubes, DataStore objects, and InfoObjects (texts and attributes, if they are released as InfoProviders).
3. Select the object from which you want to transfer data into the target. When you select the source object, input help is available. The input help shows you the selection of objects that already exist in the data flow for the target object. If only one object exists in the data flow, it is selected by default. An additional List pushbutton is available, which allows you to select a source object from the complete list of objects that exist for this object type.
4. Choose Continue. The data transfer process maintenance screen appears. The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.
   - On the Extraction tab page, the system displays information about the adapter, the format of the data, and additional source-specific settings.
   - On the Update tab page, the system displays information about the target.
   - On the Execute tab page, the system displays the processing mode for direct access and the process flow of the program for the data transfer process.
   You do not need to make any settings in the data transfer process.
5. Check the data transfer process, then save and activate it.
6. Choose Goto → Overview of DTP to display information about the source and target objects, the transformations, and the last changes to the data transfer process.

Result
You can use the data transfer process to access data directly.

Creating Data Transfer Processes for Real-Time Data Acquisition


Use
You use the data transfer process (DTP) for real-time data acquisition to transfer data to the DataStore object from the PSA. In the DataStore object, the data is available for use in reporting.

Prerequisites
You have used transformations to define the data flow between the DataSource and the DataStore object. The selections for the data transfer process do not overlap with selections in other data transfer processes.
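The non-overlap prerequisite can be verified with a simple interval test. The following plain Python sketch assumes that each DTP filter is a closed value interval on the same field; this representation is an assumption for illustration, not how BI stores selections:

```python
# Illustrative check (not an SAP API) that two filter selections, modeled
# as inclusive (low, high) value intervals on the same field, are
# disjoint, as required for parallel data transfer processes.
def intervals_overlap(a, b):
    """a and b are (low, high) tuples with inclusive bounds."""
    return a[0] <= b[1] and b[0] <= a[1]

dtp1 = (0, 4999)       # e.g. order numbers 0..4999 (hypothetical selection)
dtp2 = (5000, 9999)    # e.g. order numbers 5000..9999
disjoint = not intervals_overlap(dtp1, dtp2)   # the selections do not overlap
```

Disjoint selections guarantee that no source record is claimed by two data transfer processes at once.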

Procedure
The starting point when creating a data transfer process is the DataStore object into which you want to transfer data. In the Data Warehousing Workbench, an object tree is displayed and you select the DataStore object.
...

1. In the context menu, choose Create Data Transfer Process. The dialog box for creating a data transfer process appears.
2. Select DTP for Real-Time Data Acquisition as the DTP type.
3. As the source object, select the DataSource from which you want to transfer data to the DataStore object. The input help for the source object shows the selection of DataSources that already exist in the data flow for the DataStore object. An additional List pushbutton is available, which allows you to select a DataSource from the complete list of BI DataSources.
4. Choose Continue. The data transfer process maintenance screen appears. The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.
5. On the Extraction tab page, specify the parameters:
   a. Delta is set as the extraction mode for real-time data acquisition.
   b. If necessary, define filter criteria for the delta transfer by choosing Filter. You can then use multiple data transfer processes with disjunctive selection conditions to efficiently transfer small sets of data from a source into one or more targets, instead of transferring large volumes of data. You can specify single values, multiple selections, intervals, selections based on variables, or routines. To change the list of InfoObjects that can be selected, choose Change Selection.

      The icon next to the Filter pushbutton indicates that predefined selections exist for the data transfer process. The quick info text for this icon displays the selections as a character string.
   c. Semantic grouping is not used for real-time data acquisition. The data is read from the source in packages.
6. On the Update tab page, make the settings for error handling. You define:
   - How you want to update valid records when errors occur.
   - How many errors can occur before the load process terminates.

Note that this setting only has an impact when you repair a DTP request and when you convert the DTP to a standard DTP (for example, to correct an error during extraction). For real-time data acquisition, you specify in the InfoPackage the maximum number of failed attempts that is permitted when the daemon accesses data before the data transfer terminates with an error. For more information about error handling settings, see Handling Data Records with Errors.

7. On the Execute tab page, specify the parameters. On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.
   a. Specify the status that you want the system to adopt for the request if warnings are displayed in the log.
   b. Specify how you want the system to define the overall status of the request.
8. Check, save, and activate the data transfer process.
9. Choose Assign Daemon to go to the monitor for real-time data acquisition, provided that an InfoPackage for RDA already exists for the DataSource that is the source of this DTP.

If there is not yet an InfoPackage for RDA for the DataSource that is the source for this DTP, the system informs you that you must first create an InfoPackage for RDA before you can assign the DTP.

Alternatively, if you are in the Data Warehousing Workbench, you can go to the monitor for real-time data acquisition by choosing Assign RDA Daemon in the context menu of the data transfer process.

Result
The data transfer process is assigned to the DataSource. If the DataSource is already assigned to a daemon, the data transfer process appears in the monitor for real-time data acquisition under this daemon and the DataSource. It is now available for data processing by the daemon. If the DataSource has not yet been assigned to a daemon, the data transfer process appears in the monitor for real-time data acquisition under the DataSource in the area Unassigned Objects. In this case, choosing Assign Daemon in the context menu of the data transfer process assigns the data transfer process, the corresponding DataSource, the InfoPackage, and possibly further associated data transfer processes to the specified daemon.

Error Stack
Definition
A request-based table (PSA table) into which erroneous data records from a data transfer process are written. The error stack is based on the data source, that is, records from the source are written to the error stack.

Use
At runtime, erroneous data records are written to an error stack if error handling is activated for the data transfer process. You use the error stack to update the data to the target once the errors have been resolved.

Integration
In the monitor for the data transfer process, you can navigate to the PSA maintenance by choosing Error Stack in the toolbar, and display and edit erroneous records in the error stack. With an error DTP, you can update the data records to the target manually or by means of a process chain. Once the data records have been successfully updated, they are deleted from the error stack. If there are any erroneous data records, they are written to the error stack again in a new error DTP request. When a DTP request is deleted, the corresponding data records are also deleted from the error stack.

Examples for Using the Error Stack


Consistent Error Handling for Aggregation
Number of Records in Source is Greater than Number of Records in Target
During the transformation, the data records for request 109882 are aggregated to one data record. If, for example, there is no SID for the characteristic value order number 1000, the record is interpreted as erroneous. It is not updated to the target. Those data records that form the aggregated data record are written to the error stack.

Number of Records in Source is Less than Number of Records in Target


During the transformation, the data record for request 109882 is duplicated to multiple data records. If, for example, there is no SID for the characteristic value calendar day 07-03-2005, the record is interpreted as erroneous. The duplicated records are not updated to the target. The data record that formed the duplicate records is written to the error stack, where it is listed once for each erroneous record that was duplicated from it.

Consistent Error Handling with Respect to Order in Which Data Records are Written to Error Stack
Update to DataStore Object: 1 Request
The Order Number field is the key for the error stack. During the transformation, data record 02 of request 109882 is marked as containing errors. In addition to the erroneous data record, all subsequent data records for the request that contain the same key are written to the error stack. In this example, this is data record 03. This ensures that when error records are updated with the error DTP, the records are serialized correctly and newer data is not inadvertently overwritten by older data. Data record 01 has the same key as the incorrect data record 02 (order number 1000), but is correct and it occurred before the incorrect data record. Data record 01 is therefore copied into the target of the DTP. The order of the data records is not changed.
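The serialization rule in this example can be sketched in a few lines of plain Python (not SAP code; the record layout is invented for illustration): once a record with a given error stack key fails, every subsequent record with the same key is diverted to the error stack, while earlier correct records with that key still reach the target:

```python
# Illustrative sketch of error stack serialization (not an SAP API):
# once a record with a given key is erroneous, all subsequent records
# with the same key go to the error stack instead of the target, so that
# the error DTP can later replay them in the correct order.
def process_request(records, key_field, is_erroneous):
    target, error_stack, blocked_keys = [], [], set()
    for rec in records:
        key = rec[key_field]
        if key in blocked_keys or is_erroneous(rec):
            blocked_keys.add(key)
            error_stack.append(rec)     # serialize: same-key followers too
        else:
            target.append(rec)          # earlier correct records pass through
    return target, error_stack

records = [
    {"rec": 1, "order": 1000, "ok": True},
    {"rec": 2, "order": 1000, "ok": False},   # erroneous record
    {"rec": 3, "order": 1000, "ok": True},    # same key, follows the error
    {"rec": 4, "order": 2000, "ok": True},
]
target, stack = process_request(records, "order", lambda r: not r["ok"])
# records 1 and 4 reach the target; records 2 and 3 land in the error stack
```

This matches the example above: record 01 precedes the error and has already been updated, so it passes through, while record 03 is held back to prevent newer data from being overwritten by older data when the error DTP runs.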

Updating to DataStore Object: Multiple Requests Error in First Request


The Order Number field is the key for the error stack. During the transformation, data record 02 of request 109882 is marked as containing errors. In addition to the erroneous data record, all subsequent data records, including the following requests that have the same key, are written to the error stack. In this example, data record 01 for request 109883 is written to the error stack in addition to data record 02 for request 109882.

Updating to DataStore Object: Multiple Requests Error in Subsequent Request


The Order Number field is the key for the error stack. During the transformation, data record 01 of request 109883 is identified as containing errors. It is written to the error stack. Any data records from the previous request that have the same key were updated successfully to the target.
