(Part 3)
August 26, 2010 By msalvo 1 Comment
One of the inevitable aspects of data migration is dealing with fallout from automated data
loads. Typically, this process includes identifying the data that will not load, analyzing the error
messages to determine the root cause, formatting a readable report that can be used as a tool in
the cleanup process, and fixing the root cause of the problem so that it does not happen again.
materials for the first time; changes in the handling of tax, tax codes, and tax jurisdiction
codes; an account determination entry that is missing or not set up correctly; a missing
unit of measure or unit of measure conversion factor; a storage location in the upload
file that does not exist in SAP: any of these can cause a load to drop mostly or
completely onto the floor. While change is inevitable on any project, it is important to
control and communicate the change so that the downstream impact can be recognized
and understood. Controlled change and communication always work better than total
surprise. Perhaps if we all know ahead of time about that data field that is now required,
we can impose a requirement on the data extract side to make sure that the data field is
populated before it enters the upload file.
2. Additional data in the upload file.
Inserting a new field in the middle of the upload file data structure might be necessary for
the business to close a gap, but if that change is not communicated to the technical team
so that appropriate adjustments can be made to the load object's input structures and
processing logic, the new data will surely never load, and may cause misalignment of the
data fields that follow it in the upload structure.
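The misalignment described above is easy to demonstrate. Here is a minimal sketch; the field names, positions, and sample values are hypothetical, not taken from any actual upload file:

```python
# Sketch: why an uncommunicated new field misaligns an upload file.
# Field names and sample values are hypothetical.

expected_layout = ["article", "plant", "storage_loc", "uom"]

def parse_row(line, layout):
    """Parse a comma-delimited row against the layout the load
    program was built for."""
    return dict(zip(layout, line.split(",")))

# The old extract matches the layout the load program expects...
old_row = "100123,P001,SL01,EA"

# ...but the new extract inserts a tax code before storage_loc,
# without the load program being adjusted.
new_row = "100123,P001,V1,SL01,EA"

good = parse_row(old_row, expected_layout)
bad = parse_row(new_row, expected_layout)

print(good["storage_loc"])  # SL01 - correct
print(bad["storage_loc"])   # V1   - every field after the insert has shifted
```

Every field after the insertion point now carries its neighbor's value, which is exactly the kind of fallout that is hard to diagnose from the error messages alone.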
[Chart: of 100 records loaded, 90 succeeded (90% success) and 10 failed (10% fail).]
This technical metric indicates only that all of the SAP validation rules for posting the
transaction have passed. It does not indicate that the master or transactional data posted to SAP
is actually correct in functional terms.
The session log shows the status of all 229 transactions. This screen snapshot is a fragment of
the complete session log for the batch input session. It shows many successful transactions
(Type = S) and one failed transaction (Type = E). The error message here is clear: the
article does not exist or is not activated.
But as you can imagine, the 13 failed transactions with error type = E are sprinkled throughout
the many pages of this log file. With only 229 transactions, this log file is quite easy to pick
through to find the 13 errors. But imagine if the number of transactions were in the thousands or
tens of thousands. How do we extract only the failed transactions and present a concise report of
the failed transactions?
To do this, I use SAP transaction SM35P Batch Input Log Overview. This transaction has the
ability to set a filter on any field in the batch input log file, display the filtered results, and then to
export the results to a local file.
To enter the mode where this is possible, first press the PRINT icon.
Next, set the filter. The appropriate filter field here is SESS. TYPE.
We only want the errors, so set the filter for field SESS. TYPE = E.
The display now shows only the 13 rows containing the error messages. This can be exported
directly to a local spreadsheet for further analysis.
By pressing the status list icon (shown above), the display will show the status messages for
the fallout. Once these messages are displayed, pressing the export icon allows me to save
the screen contents to a spreadsheet.
Now it would be really nice if I could have the article number in the spreadsheet right next to the
error message. The article number in the ARTMAS IDOC is stored in segment
E1BPE1MATHEAD. The segment content for each IDOC can be displayed by pressing the list
specific segment icon and entering the segment name in the box.
The segment display will show all fields in the segment, so I usually hide all of the columns that
I don't want to see. Here is the E1BPE1MATHEAD segment display showing only the article
number. I can use the export icon to save the list of article numbers to another spreadsheet.
Here is a portion of the complete spreadsheet showing the error messages and the article
numbers side by side.
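The side-by-side join can also be done outside of Excel. Below is a minimal sketch that pairs the two exports by row position; it assumes both exports preserve the same IDOC order, and the sample messages and article numbers are illustrative only:

```python
# Sketch: pair each exported error message with the article number
# exported from segment E1BPE1MATHEAD, by row position.
# Assumes both exports are in the same IDOC order; sample data is made up.

def join_exports(article_rows, error_rows):
    """Zip the article-number export with the error-message export."""
    return [
        {"article": art, "message": msg}
        for art, msg in zip(article_rows, error_rows)
    ]

articles = ["100123", "100456"]
errors = [
    "The article does not exist or is not activated",
    "The article does not exist or is not activated",
]

for row in join_exports(articles, errors):
    print(row["article"], "|", row["message"])
```

If the two exports cannot be guaranteed to share row order, joining on the IDOC number instead of row position would be the safer choice.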
A filter applied to the spreadsheet shows that the 2,593 errors are all grouped into one of three
error status categories. By selecting a single category, Excel will also show me the number of
records within that failure category.
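The same grouping and counting that the Excel filter provides can be sketched in a few lines of code. The message texts below are illustrative stand-ins, not the actual error categories from the load:

```python
# Sketch: group exported error messages into categories and count the
# records in each, mirroring the spreadsheet filter. Sample messages
# are illustrative only.
from collections import Counter

messages = [
    "The article does not exist or is not activated",
    "The article does not exist or is not activated",
    "Tax code V1 not defined for jurisdiction",
    "Unit of measure CSE not maintained",
]

by_category = Counter(messages)
for category, count in by_category.most_common():
    print(f"{count:5d}  {category}")
```

Sorting by count puts the largest failure category first, which is usually where the cleanup effort should start.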
Sometimes it is easier to mine the status messages directly from the IDOC status table EDIDS.
This is especially true where the processing module is a BAPI which returns an error table rather
than a single error message. In this case, when you press the status list icon in WE05, only the
first error status message of several is displayed for each IDOC. I find that the first message is
not very helpful (as shown below), and that the second or third message in the return status
table is usually the important one. You won't see it displayed on the WE05 screen, but you can
mine it from the EDIDS table.
SAP transactions SE11 or SE16 both support this activity. For the selection criteria I use the
IDOC number range, status 51, and status type E. On the display screen, choose only the
relevant fields for display: the IDOC number (DOCNUM), IDOC status (STATUS), status
message (STATXT), the four substitution parameters for the status message (STAPA1, STAPA2,
STAPA3, STAPA4), and the message type (STATYP). All of this can be exported into a
spreadsheet. If you really want to test your Excel skills, you can write code that will move the
substitution parameters into their placeholders in the status text.
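Instead of Excel, the substitution can be sketched in a few lines of code. This assumes the status text uses the numbered `&1`-`&4` placeholder form; SAP message texts sometimes use a plain `&` instead, which this sketch does not handle. The message text and parameter values below are illustrative:

```python
# Sketch: substitute the STAPA1-STAPA4 parameters into the &1-&4
# placeholders of the STATXT status message. Assumes numbered
# placeholders; the sample message is illustrative only.

def fill_placeholders(statxt, stapa1="", stapa2="", stapa3="", stapa4=""):
    """Replace &1..&4 in the status text with the substitution parameters."""
    for placeholder, value in (("&1", stapa1), ("&2", stapa2),
                               ("&3", stapa3), ("&4", stapa4)):
        statxt = statxt.replace(placeholder, value)
    return statxt

msg = fill_placeholders("Article &1 does not exist in plant &2",
                        "100123", "P001")
print(msg)  # Article 100123 does not exist in plant P001
```

Applying this row by row to the exported EDIDS data yields fully readable error messages without any spreadsheet formulas.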
Preparing for the next data migration cycle: let the fallout
analysis and cleanup begin.
Presenting the fallout report to the business with a set of clear error reasons and links back to the
legacy data is key to enabling the legacy data cleanup process to proceed. In the iterative process
of data migration cycles, cleansing the legacy data is a step in the right direction towards an
improved next conversion cycle.
I hope you enjoyed this blog series on data migration. Please feel free to send comments or
questions.
Filed Under: SAP ABAP Blog, SAP Functional, SAP Technical Tagged With: ABAP, ALE,
Basis/Netweaver, Data Migration, DataXstream, Integration, Mike Salvo, NetWeaver, Project
Management, SAP, SAP ABAP, upgrades
Comments
1.
ken says:
July 3, 2011 at 10:31 am
Very helpful series. Do you have any suggestions on training for the business users who
should be responsible for the data collection? Such as the training scope, key topics,
interaction type, sampling? Thanks a lot.