
Infocube Key Figure is not Populating in Planning Area
Dear friends,

I am working on APO 4.1.


There is a key figure with semantics 002 - InfoCube.
Data is there in the InfoCube, but it is not getting populated in the planning book.
The proper settings are done in the planning area and in RSA1.

Please tell me any possible reason.


Thanx
Vishal..

Hello Vishal,
What is the data load status in the InfoCube? Try this: go to your InfoCube, right-click,
and select Manage. Tell me if the request status is green or yellow.
It should be green. Let me know.

Hi vishal,

There are two possibilities:


1. The key figure has been created with reference to another key figure. Double-click
your key figure and check at the top right-hand corner whether some field is mentioned under
the reference tab. If yes, then create a new key figure without any reference and then load the data.

2. If the above is not the problem, run the load activity in the foreground and analyze the error log
generated. Based on the error log messages, I might be able to suggest some other suitable
way to achieve the desired results.

Regards
Ankur

Vishal -- Did you ever solve this problem? I am having the same problem. I can see the
data in the infocube (using transaction LISTCUBE) but the data is not populating in the
planning book. There is no error message; the data just does not show up in the planning
book.

All data in the infocube was loaded from Excel. Some data from the same KF in the
infocube does show up in the planning book. This data was entered several months ago.
But now, using the same Excel process, the data that is loaded into the infocube appears
in the infocube, but does not populate the key figure in the planning book in APO.
Hi,
A couple of checks:
1. Is the data being loaded into the planning area in the same time range as the
planning area initialization?
2. Do you have CVCs created for the data being loaded?
3. If these are still not giving you any solution, then try the transaction
"/SAPAPO/TSCUBE". You can manually load the data to Planning area from a
cube.
Hope this helps.

Hi Visu -- Thanks for the suggestions. All good ideas. I've already
checked all these. The time series objects are initialized with the proper
time frame (in fact, I deactivated the planning area and then re-initialized
the time series), all the CVCs are created, and I also tried TSCUBE but it
also was not successful. I also verified that the units of measure match. I
also re-activated the update rules in the infocube, but that did not help. It
seems as if "something" has changed, perhaps in the cube, because other
data previously loaded for the exact same CVC does show up in the
planning book. When I look at the data using LISTCUBE, the data that is
loading into the planning book looks identical to the data
that is not loading (only the quantities differ, which is how I can
tell which records are loading and which records are not loading).

Any other ideas?


Thanks.

For those who may be watching this thread: the problem has been
solved. The problem was that there were several failed
InfoPackages that had been loaded (via flat file
from Excel) into the InfoCube. Apparently, these failed
InfoPackages were somehow interfering with APO's ability to read
any subsequent data from the InfoCube, i.e., APO could not "see"
any data that had been uploaded to the InfoCube
after the failed InfoPackages. It seemed similar to the way a
problem in the CIF will block all subsequent traffic in the CIF. In
any case, after deleting the failed InfoPackages from the InfoCube,
everything was fine, and all data that had been loaded after the
failed InfoPackages was suddenly visible in the planning book in
APO.

Problem solved with details provided by reportee.

Forecast not appearing in SNP book


Ashfaq Ali
Even after a successful release of the forecast from DP to SNP in /sapapo/mc90, and also via the
process chain, the forecast is not appearing in SNP.

We have two planning areas, for DP and SNP. It is working for one planning area: we are able to
release the data from DP to SNP and it appears in the SNP planning book. But it is not
working for the second planning area, and the forecast does not appear in that SNP planning book.

We ran all the consistency checks to remove internal inconsistencies. The forecasts for all the
products are not being populated in the SNP planning book, even after a successful release in
/SAPAPO/MC90.

Can you please point out how to resolve this issue.


Ashfaq,

- Can you verify whether your product/location characteristics are 9AMATNR/9ALOCNO?


- To check this, go to the related planning object structure (POS) --> check the assignment of
product/location under one of the menus at the top (Goto/Settings, I guess)
- If it is not 9AMATNR/9ALOCNO, then make the changes accordingly in
/SAPAPO/MC90 under the Extended section
- Verify the version for source and target, and check the date range
- Try releasing it for one product/location
- Try using *product code* instead of just entering the product number in the product
selection section

Hope this helps

Aparna Ranganathan
Ashfaq
Looks like an issue in the category assignment in the planning area to me. Go to
/n/sapapo/msdp_admin, find the problematic planning area, and check which category /
category group is assigned to your forecast key figure. If your forecast gets released with
category FC to SNP, then that category has to be assigned to the forecast key figure in your
SNP planning area. I bet this is the problem, since you are able to see the forecast values
in the other SNP planning book.

Thanks
Aparna

Hello,

Please first check whether you can see the forecast orders in product view /sapapo/rrp3.
Please also check the 'Forecast' tab there -- the forecasts may be consumed by other
requirements.
If you can see the forecasts in the 'Elements' tab, then you can continue to check the planning
area design.
Make sure the correct category group is assigned to the key figure in which you want to show the forecast,
and that the categories FA and FC are included in the category group.
Please also make sure no macro is affecting the key figure.

Best Regards
Ada

Ashfaq Ali
The issue is that the problem exists with only one particular material and location
combination; however, the material is releasing and appearing in the planning
book with other locations.
Can anyone point out what the issue could be?

Ashfaq Ali
The released forecasts are appearing in the RRP3 transaction under the Elements tab but are
not visible in the planning book.
Ashfaq Ali
The issue is resolved now. There were inconsistencies identified in
/sapapo/om17:

a pegging area inconsistency between the APO DB and liveCache.

Thanks all for your answers

 /SAPAPO/CCR - CIF_DELTAREPORT3
 /SAPAPO/RRP3 - Product View
 /SAPAPO/SDP94 - Supply & Demand Planner: Init. Screen
 /SAPAPO/MSDP_ADMIN - SNP and DP Administration
 /SAPAPO/CQ - SCM Queue Manager
 /SAPAPO/MC62 - Maintain Characteristic Values
 /SAPAPO/OM17 - Data Reconciliation
 /SAPAPO/RES01 - Change Resources
 /SAPAPO/MAT1 - Product
 /SAPAPO/RLCDEL - Delete Orders from Livecache
 /SAPAPO/MC90 - Release to Supply Network Planning
 /SAPAPO/MVM - Model/Planning Version
 /SAPAPO/SDORDER_DEL - Deletion Report for SD DB Data
 /SAPAPO/ADVM - Macro Workbench
 /SAPAPO/TSCUBE - Load Planning Area Version
 /SAPAPO/RRP4 - Receipts View
 /SAPAPO/SDP8B - Define Planning Book
 /SAPAPO/C3 - Display Application Log
 /SAPAPO/C5 - Send Planning Results to OLTP
 /SAPAPO/SCC03 - PPM Maintenance
 /SAPAPO/LOC3 - Master Data: Locations
 /SAPAPO/C4 - Maintenance of Global Parameters
 /SAPAPO/CP1 - Maintenance of Distribution Criteria
 /SAPAPO/ATPQ_PAREA_W - Product Allocations in Plan. Area

Deletion of forecasts released from APO DP to APO SNP
We are using APO 5.1 connected to ECC 5.

For a given product, we have APO DP forecasts released to SNP, as standard ATP category FA.

The forecasts released to DC locations (loc type 1002) are shown in the APO product view as
having APO application of 'Planned Independent Requirements', and I can delete these via pgm
/SAPAPO/RLCDELETE.

The forecasts released to factory locations (loc type 1001) are shown in the APO product view as
having APO application of 'Initial' however, and I cannot delete these via pgm
/SAPAPO/RLCDELETE.

My questions are:

- why is the APO application different for the forecasts at the different location types?
- how can I delete the forecasts at the factory location by a batch program?

Any ideas welcome...


Hi,
This can be done by using the report /SAPAPO/RM60RR20 in a batch job to delete the
forecasts,
or you can use transaction /SAPAPO/MD74 to delete them through
foreground/background jobs.

Thanks
Ramesh
Hi,

This is news for me too.


I have not seen "Planned Independent Requirements" as the application until now. By any chance, are
these planned independent requirements at the DCs coming from R/3, and not from DP as you think?

I have also sometimes faced issues in the deletion of forecasts, though I am not sure why, and I
didn't dig further into it.
You could try using transaction /SAPAPO/MD74 or report /SAPAPO/RM60RR20 in a batch job
to delete these forecasts. Also, I think deletion should be possible through the report
/SAPAPO/DELETE_PP_ORDER.

Thanks - Pawan
Hi,

Further to Pawan's reply, if the orders are still not getting deleted by report
/SAPAPO/DELETE_PP_ORDER, then try below.

1. Open report /SAPAPO/DELETE_PP_ORDER


2. Enter the necessary details on the screen that appears. Make sure 'Delete orders' check box is
selected.
3. On the command bar (where we write tcode for easy access) type DEL_EXECU and press
'Enter'
4. Execute (F8)

Orders would get deleted.

Regards
Manotosh

How to delete Master Data in APO?
 How do you roll back / delete master data (e.g. product master / material master) which has
been CIFed from R/3?

In the product master, when you choose Product -> Mark for Deletion, on the pop-up
screen:

If you choose the product and then save, the system sets a deletion flag for the product. If
you choose a location and then save, the system sets a deletion flag for the specific location
data for the product.

If you choose the product and then click Continue, the system deletes the product.
If you choose a location and then click Continue, the system deletes the specific location
data for the product.
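The four outcomes described above can be sketched as a small decision table (illustrative Python, not SAP code; the result strings are paraphrases of the documented behaviour):

```python
# Illustrative decision table for Product -> Mark for Deletion:
# Save sets a deletion flag; Continue deletes immediately. The scope is
# either the whole product or only its location-specific data.

def mark_for_deletion(selection: str, action: str) -> str:
    """selection: 'product' or 'location'; action: 'save' or 'continue'."""
    if selection not in ("product", "location"):
        raise ValueError(selection)
    target = "product" if selection == "product" else "location data"
    if action == "save":
        return f"deletion flag set for {target}"
    if action == "continue":
        return f"{target} deleted"
    raise ValueError(action)

print(mark_for_deletion("product", "save"))      # deletion flag set for product
print(mark_for_deletion("location", "continue")) # location data deleted
```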

I think he is asking how to automatically delete the material via CIF, so that if you mark
the material for deletion in R/3 it deletes the material in APO.

I agree. But what he is really asking is: how do you undo a CIF that has happened,
right? I don't think you can undo that command. If you had your system backed up, you
could restore from the backup and lose the delta changes from the backup time to the
current time. That becomes a question of, "is it worth the loss?"

You are right. Manually deleting these products is the only way. But it would be nice if
products which were marked for deletion in R/3 were no longer planned in APO.
OSS note 331664 talks about a workaround, but it should be automatic.

Use the OM17 transaction (report) to delete the records.

I'm not sure what using OM17 will accomplish.

Thanks for your help. The transaction (OM17) is for 'action in KANBAN processing'. Is it the
right transaction?

You must be on an R/3 system. Try the transaction on the APO system.

OM17 is the liveCache consistency check...

To delete this information, you must use /sapapo/om_lccheck.

OM17 is for the liveCache consistency check... Can you explain how you can use this to
delete records CIFed from R/3?
You are correct about OM17; it is not used to delete data. To delete transactional data in
APO, use transaction /SAPAPO/RLCDEL. To delete master data, you have to delete it
through mass maintenance or manually in the maintenance screen for the master data
element (material, PPM, resource, etc.)

Delete master data from APO (DP)
We have a lot of old products in APO DP for which the master data (of the InfoObject) needs to be
deleted.
These products don't have any CVCs. When we try to delete the master data from the InfoObject,
the system takes some 5-6 minutes and then I get a message that no master data was deleted.

As there is no CVC, I should be able to delete these old products from the DP module. This is
happening after applying the hierarchy, or after activating master data.

If I load any flat file that has incorrect data, then before activating I am able to delete.

Is there any way to reverse the apply-hierarchy or activate-master-data step?


Radhakrishnan

Hi
1. As one alternative, just try to delete that InfoObject; you will get an error message such as:

Still used in InfoCube XXXXXX (default)

When you right-click on an InfoObject, you get "Additional Functions", and therein you have the
option "Display Logs".

This takes you to the SLG1 transaction and shows the detailed error log.

Please check that you have set detailed logging (transaction /SAPAPO/C4).

This way you will get a list of InfoCubes or other objects where the InfoObject is being used.

2. Otherwise, please click on the characteristic (InfoObject) and click the "where-used" button to
find all the objects where your InfoObject is used. Then go to each particular object and check one
by one.

Option 1 is not recommended, however.

Hope this helps.


Regards
Datta
Hi,
The master data is not getting deleted because it is certainly used somewhere.
When you get an error, please get to the details of the error.
Then you will be able to take corrective action.

We too had this issue, and we started deleting a group of products, though it was a time-consuming
activity.

Checking the error log is important.

I hope this gives you some insight.

Please check and confirm.


Regards
Datta

Thanks for the help

The message show as below

No master data was deleted


Message no. RSDMD129

Diagnosis
No master data was deleted, either because it was all still in use, or because the user did not
want to delete it upon request.

Procedure
First delete the master data in the places where it is still being used:

1. In the InfoCube
2. As an attribute
3. In a hierarchy
Then try deleting again.

Please suggest how to search for these usages.


I will re-check.

Thanks a lot
The master data is linked to some nine InfoCubes.
The error log showed the problem. Now I am doing selective deletion from all the cubes to get
rid of this junk data.
Radhakrishnan
Maintaining the Business System Group
Use
In this IMG activity, you determine the assignment to a business system group of the APO System and
the SAP R/3 System to be connected. By doing this, you create areas with the same naming conventions.
This guarantees that the same names are used for master data, and that they are synchronized in
distributed system landscapes.
Prerequisites
As an independent logical system, the APO System must, in addition to the SAP R/3 Systems, be assigned
to a business system group (BSG).
Procedure
1. In Customizing for mySAP SCM - Implementation Guide, choose Inventory Collaboration
Hub -> Integration of SAP SCM and SAP R/3 -> Basic Settings for Creating the System
Landscape -> Maintain Business System Group.
2. Choose New Entries.
The New Entries: Overview screen appears.
3. Specify the following information for the business system group:
 An alphanumeric key (maximum of 10 characters)
 A description
4. Save your entries.
Example 1 of 2
An APO System is to be linked with two SAP R/3 Systems (A and B), in which two different materials (for
example a hammer and a screw), have the same material number (100). Both materials are to be
represented as two different products in the APO System.
Assign both SAP R/3 Systems to different BSGs. Assign the APO System to one of the BSGs.
SAP R/3 System A (Material number 100 = Hammer) -> BSG A
SAP R/3 System B (Material number 100 = Screw) -> BSG B
APO System -> BSG A
In order to avoid having two identical names and to be able to uniquely assign the material numbers,
you need SAP enhancement APOCF005, the inbound-processing exit for products (transaction SMOD or CMOD).
Material number 100 from BSG A receives, for example, product number 100 in the APO System and
material number 100 from BSG B gets product number 100_B. This allows you to uniquely assign the
materials.
For inbound processing in the APO System, the following SAP enhancements are available as customer
exits for master data:
 APOCF001 : Inbound processing location
 APOCF005 : Inbound processing product
 APOCF008 : Inbound processing resource
 APOCF0012 : Inbound processing production process model
Example 2 of 2
An APO System is to be linked with two SAP R/3 Systems (A and B). In SAP R/3 System A, a particular
screw has material number 110. In SAP R/3 System B, it has material number 120. Both material
numbers are represented in the APO System by one product, with product number 110.
Assign both SAP R/3 Systems to different BSGs. If possible, assign the APO System to the BSG
whose data does not have to be renamed.
SAP R/3 System A (material number 110 = screw) -> BSG A
SAP R/3 System B (material number 120 = screw) -> BSG B
APO System (product number 110 = screw) -> BSG A
You use SAP enhancement APOCF005 to convert the local SAP R/3 material number 120, to the APO
product number, 110.

For the time being, this second scenario is only supported for material masters and product
masters. If, for example, the same customer is used with different BSGs in R/3 Systems A and B,
you must create two separate customer locations in the APO System. This applies for the vendor,
plant, and other master data.
There is further information in the application component SAP Advanced Planner and Optimizer
(SAP APO) under Master Data General.
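The renaming logic of the two examples can be sketched as follows (illustrative Python only; the real implementation is an ABAP customer exit such as APOCF005, and the mapping rules hard-coded here are just the two examples above, not a real table):

```python
# Illustrative sketch of an APOCF005-style inbound exit: derive a unique
# APO product number from (business system group, R/3 material number).
# The rules below mirror the two documented examples; they are hypothetical.

def apo_product_number(bsg: str, material: str) -> str:
    # Example 1: the same material number means different products per BSG,
    # so the "foreign" BSG gets a suffix to keep product names unique in APO.
    if material == "100":
        return "100" if bsg == "BSG_A" else "100_B"
    # Example 2: different local material numbers for the same screw are
    # converted to one common APO product number.
    if (bsg, material) in {("BSG_A", "110"), ("BSG_B", "120")}:
        return "110"
    return material  # default: pass the number through unchanged

print(apo_product_number("BSG_A", "100"))  # 100
print(apo_product_number("BSG_B", "100"))  # 100_B
print(apo_product_number("BSG_B", "120"))  # 110
```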

Can anyone share with me any two critical issues in a DP support project?
Hello friends,
I am new to a DP support project on SCM 5.0. Can anyone help with my queries?
1. What are my support roles and responsibilities in DP?
2. Can anyone share any two critical issues in DP and how you resolved them?

Hi Sivaram,

1. As a DP functional consultant you need to resolve all the incidents raised in the APO DP module
(forecasting, lifecycle planning, master data in APO DP, planning books, macros, release to
APO SNP or R/3 or other systems, CVC maintenance, etc.), which will also include the BW part of
SCM-APO related to DP, like data extraction and backup into and from InfoCubes. You will
also support all the interfaces related to APO DP, for the APO-related part of each interface.

You need to resolve these incidents in the time frames defined by the SLAs for the support project.
You will also have to carry out problem management for some of the critical incidents so that
incidents do not recur.

Further, you will have to support release/change management (if it is part of your
support contract).
This will involve testing changes in the quality system before they are moved to the production system.

2. I found two critical issues in APO DP support:


2.1 The sales history extraction interfaces from the BW system used to have problems: technical
problems of the interface / jobs in BW running late, and functional problems: the sales history extract in BW
not being correct for various reasons.
This used to lead to delayed job runs in APO and finally delayed forecasting and other DP
processes.
Through proactive monitoring by the BW team and appropriate actions for master data issues,
the frequency of this issue was reduced, but not completely eliminated.
2.2 Another major issue was related to the backup process of APO DP planning data into InfoCubes.
Finally, by splitting the jobs into daily, weekly, and monthly backup procedures, we were able
to resolve the issue.

I hope I have answered your queries.

Hi Sivaram,

Below are the points that I would like to share with you

1) Support roles and responsibilities in DP

a) Data consistency during demand planning


b) Data backups
c) Running inconsistency checks
d) Day-to-day monitoring of jobs
e) Day-to-day monitoring of process chains
f) Support for DP end users

2) Two critical issues

a) Issue:- Process chain failure


Resolution:- Identify the logs and take corrective action

b) Issue:- Data not matching between planning area & cube


Resolution:- Inconsistency checks

Regards
R. Senthil Mareeswaran.

Hi Senthil,
a) Issue: Process chain failure
Resolution: Identify the logs and take corrective action
Why do process chains fail? Please mention some reasons as per your
observations.

b) Issue: Data not matching between planning area & cube


Resolution: Inconsistency checks
Is it only inconsistency checks, or are there more reasons? Please mention them for me.

Thanking you
sivaram

Hi
The two most critical issues in DP support which I have faced are as follows:

1. Changing market/economic situations compel the business to continuously


change their DP processes in order to achieve the highest forecast accuracy. Implementing
these continuous changes makes the process complex; we lose track of the past
process, and the new process becomes cumbersome.

2. The second critical issue I find is restricting the pollution of the DP environment with non-relevant
master data (CVCs). This becomes the major cause of the problem mentioned in
the thread above, i.e., data mismatch between the planning book and the cubes.

Warm Regards,
Rahul Mohnot

Hi Sivaram,

Apart from what has been mentioned above,

Tickets on aggregation & disaggregation are raised quite often, so you should be aware of
the key figure disaggregation settings.

Many times, a data mismatch between the cube & planning area is due to a CVC not being present
or not maintained correctly.

Tickets are also raised by users stating that the statistical forecast is not proper, mostly with the seasonal
model. This is due to improper sales history.

Regards,
Chetana

support issues DP
hi experts,

Can anyone share two APO DP support issues - any of macros, realignment, lifecycle
planning, or process chains?

1. How did the problem happen?


2. How did you resolve it? What steps were taken to prevent it in future?

Thank you for your valuable time spent on this. Please answer as soon as possible.

regards,

Hi Sainath,

You need to be more clear on this.

Rgds
Sourabh

dear sourabh,
thanks for your reply.

My question is:

Can you share any two difficult issues from an APO DP support project -
how the problem arose and how you resolved it, from start to end?
A quick reply would be very helpful for me.
thanks.
ch.sainath

Dear Sai

Issues vary as per the business requirements and the configuration implemented in
a particular project. A couple of simple issues which were faced by our end users are
below:

> An area executive was unable to put estimates in the planning book. This was due to the
authorization not being available for that particular selection ID. Action was then taken
accordingly by providing the authorization.

> Aggregation / disaggregation results were not as expected. The data is then thoroughly
analyzed to find out the reasons these things happen, and corrected manually.

> Realignments happen when there is a change in the structure of a product / customer.

> Other issues include background jobs getting cancelled / errors in background jobs.
These can be solved by analyzing the logs generated.

Regards

Raghavendra Emani

dear ragh,
thanks for your reply.

you said one point

1. Aggregation / disaggregation results are not as expected. The data is then


thoroughly analyzed to find out the reasons these things happen, and
corrected manually.

Question:

1. Why were the aggregation results not updated in the planning book, and what is the root
cause of this? How did you analyze it? What have you done to prevent this in
future?

Please share an example with me; that would be great.

I appreciate your time and response.


thanks
ch. sainath.

Hi Sainath,

APO DP issues can vary based on the client structure and implementation done.

Some of the issues are mentioned below.

1. Data mismatch between the cube and the planning area. This may be due to realignments,
data not flowing from R/3 to APO, or a process chain not updating the DTP
due to failures.
2. Macro-related issues, such as seeing different values in different planning books.
This may be due to a default macro in one planning book and not in another, because
the values shown in the planning book are not saved to liveCache.

Kindly let us know if you are looking for a particular issue.

Thanks,
Diana

Thanks for your valuable reply.

As you said in your first point, there is a data mismatch between the cube and the planning area. Could you
please explain it to me in detail, taking an example scenario?

Questions:
1. Why did the data mismatch between the InfoCube and the planning area occur? If because of realignment,
what did you do in the realignment, and how does it affect the data?

2. Data not flowing from R/3 to APO: why, and where did you find the problem?

And coming to the second point:

1. Why were the macro values not saved in liveCache, and what is the reason for it? And what
solution did you suggest?

I'm very close to what I want; waiting for your reply to close the thread.

Regards,
ch. sainath.

Hi Sai,
Here are the details of the second point mentioned by me: data mismatch between
two planning books for the same key figure.

Issue: The user was facing an issue where, for one key figure, he was seeing
different values in two planning books.

Analysis: During the analysis we found that the key figure is a calculated


key figure based on a macro. In one planning book the key figure had a default
macro, whereas in the other it was a directly executable macro, or no macro existed
for that key figure. For example: consider PB1 and KF3, which is calculated through a
macro as KF1 + KF2. The same key figure KF3 is part of another planning
book, PB2. Now, when the user updated the values for KF1 and KF2 in planning
book PB1 and pressed Enter, he would be able to see the updated value for KF3 in
PB1; but if he goes to PB2, he will not be able to see the updated values for KF3
unless he saves planning book PB1 after the changes. The issue mainly
happened here because the values were not directly entered by the client for KF1 and
KF2 in the planning books, but were uploaded through a cube to the planning area.

Reason: The values don't get updated in liveCache if they are not
saved. The liveCache values are the ones which get displayed in the planning books.

Solution: We first provided training to the client and told them about the
different macro types, and also made the macro part of the background
job, so that every day whatever changes have happened get saved to
liveCache and the values get updated in all the planning books.

Thanks,
Diana
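The behaviour described above can be modelled with a small sketch (illustrative Python, not SAP code: `LiveCache` and `PlanningBook` here are hypothetical stand-ins for the real objects, and the "default macro" is reduced to recomputing KF3 = KF1 + KF2 on data entry):

```python
# Illustrative model: two planning books over one shared liveCache.
# PB1 recalculates KF3 via a "default macro" in its own working copy,
# but other books only see values once they are saved to liveCache.

class LiveCache:
    """Shared store; only *saved* values are visible to every book."""
    def __init__(self):
        self.values = {"KF1": 0, "KF2": 0, "KF3": 0}

class PlanningBook:
    def __init__(self, name, live_cache, has_default_macro):
        self.name = name
        self.cache = live_cache
        self.has_default_macro = has_default_macro
        # working copy: snapshot of liveCache at open time plus unsaved edits
        self.unsaved = dict(live_cache.values)

    def enter(self, kf, value):
        self.unsaved[kf] = value
        if self.has_default_macro:  # default macro fires on data entry
            self.unsaved["KF3"] = self.unsaved["KF1"] + self.unsaved["KF2"]

    def display(self, kf):
        return self.unsaved[kf]

    def save(self):
        self.cache.values.update(self.unsaved)

lc = LiveCache()
pb1 = PlanningBook("PB1", lc, has_default_macro=True)
pb2 = PlanningBook("PB2", lc, has_default_macro=False)

pb1.enter("KF1", 10)
pb1.enter("KF2", 5)
print(pb1.display("KF3"))  # 15 - recalculated in PB1's working copy
print(pb2.display("KF3"))  # 0  - liveCache not yet updated

pb1.save()
pb2 = PlanningBook("PB2", lc, has_default_macro=False)  # reopen PB2
print(pb2.display("KF3"))  # 15 - visible everywhere after save
```

This mirrors the fix Diana describes: scheduling the macro (and save) in a background job ensures the recalculated values reach liveCache, so every planning book shows the same numbers.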
Thank you for your valuable time spent on this. It will be very
helpful. Have a nice day.

Regards,
ch. sainath

SAP APO DP
Hi

Please tell me the biggest issues handled while implementing SAP APO DP, and the
challenges handled during implementation.
Thanks
Prasanna
Hi Prasanna,

Some typical DP implementation issues & the challenges handled are as follows:

1) Choosing the right forecast method: The effective way of handling this is to analyze the
past historical data in terms of the business requirements and then to arrive at
a suitable method which supports trend, promotion, and lifecycle planning.

2) Accuracy of the statistical forecast: This issue can be handled via effective cleansing of past
historical data containing one-time orders, obsolete products, products which are on hold by
marketing, etc.

3) Change management: Getting the users to start forecasting, and end-user training, is a big
challenge, which can be addressed by appointing a change management person and putting a change
control strategy in place.

4) Complexity of forecasting: This can be avoided by choosing only relevant products to send
to and maintain in APO, rather than planning all products in APO.

5) Complexity of configuration: This can be handled by choosing SAP standard objects like
planning areas, planning books, profiles, etc., which saves time, effort, and resources, eases
implementation, and avoids issues later on.

Regards
R. Senthil Mareeswaran.

Real time Characteristic Value Combination


(APO DP) Validation with ECC
A. What is CVC?
A CVC (Characteristic Value Combination) is master data of APO Demand Planning. It is a group of
characteristic values (hence "combination") which is used in the forecasting process. With
CVCs, we define the characteristic values with which forecasting can be done. To learn more about
CVCs, please check - http://goo.gl/qjkwB

B. Steps to create CVC


CVCs can be created in transaction /n/SAPAPO/MC62 using following different options:-
 Create Single Characteristic Combination
 Create Characteristic Combination --> Create Manually
 Create Characteristic Combination --> Load to Worklist w/ data source Planning object Structure
 Create Characteristic Combination --> Load to Worklist w/ data source InfoProvider
 Create Characteristic Combination --> Load to Worklist w/ data source File
 Create Characteristic Combination --> Load to Worklist w/ data source Business Add-In
 Create Characteristic Combination --> Generate Immediately w/ data source Planning object
Structure
 Create Characteristic Combination --> Generate Immediately w/ data source InfoProvider
 Create Characteristic Combination --> Generate Immediately w/ data source File
 Create Characteristic Combination --> Generate Immediately w/ data source Business Add-In
 Create Characteristic Combination --> Generate in Background w/ data source Planning object
Structure
 Create Characteristic Combination --> Generate in Background w/ data source InfoProvider
 Create Characteristic Combination --> Generate in Background w/ data source File
 Create Characteristic Combination --> Generate in Background w/ data source Business Add-In

C. Problem faced by our client?


 APO Demand Planning performs no real integration check with ECC, so it creates the CVC combination regardless of whether the characteristic values are valid or invalid. Our client wanted a real-time check against R/3, so that a CVC is created only if valid characteristic values are used.
 Also, this check should be done irrespective of the method used for creating the CVC.

D. What options we tried...


 When it comes to CVC validation, method CHECK_MANUAL_INPUT of BAdI /SAPAPO/SDP_MASTER is the first option that comes to mind. The issue, however, was that this method is called only for the Create Single Characteristic Combination way of creating CVCs, so we had to rule it out.
 Next, after some analysis, we thought an implicit enhancement at the start of class method /SAPAPO/CL_SCMB_PSTRU_PLOB=>CREATE_PLOBS might be helpful. After some debugging we were able to confirm that this point is reached by all the ways of creating CVCs and thus might be the best point to add validation. But we then found that the class did not allow the CVC table parameter to be modified, so we could not remove invalid combinations from the internal CVC table to prevent their creation. Hence this option was out as well.

E. Final solution!!
Some more (actually a lot more…) analysis and debugging took us back to BAdI
/SAPAPO/SDP_MASTER, but this time to method COMBI_ENRICH. The documentation of this BAdI
method suggests that it can be used to enrich CVC values just before creation, so this is the
place where we can modify CVC values as required. This method also gets called in all of the above
ways of creating CVCs. Thus we decided to use BAdI /SAPAPO/SDP_MASTER~COMBI_ENRICH.
The technical pseudo logic is discussed in the section below and gives a clear picture of how we
used this BAdI method for our validation scenario.

F. Technical Pseudo Logic


1. Create an implementation for BAdI /SAPAPO/SDP_MASTER method COMBI_ENRICH.
2. To log an error message for an invalid CVC to the application log, we need the application log handle.
When control reaches this point, call function module /SAPAPO/TS_GENER_CODING, passing the
planning object and program class 'PSTRU_PLOB_CREATE', to get the internally generated program
name for the respective planning object. The variable '(Generated program name)GV_LOGHANDLE'
from ABAP memory then gives the handle of the application log currently in use.
3. All CVCs are available in table parameter CT_PLOB_VALUES_TAB of this BAdI method. Pass these
CVCs to an RFC function module in R/3 for real-time validation and get back the error messages.
4. If an error message is returned for a CVC, log it in the application log using the handle
retrieved from ABAP memory, as explained in point 2 above. Also delete the respective CVCs
from table parameter CT_PLOB_VALUES_TAB so that they are not created.
5. From a performance point of view, it is advisable to pass all CVCs to R/3 in a single RFC call, get
the validation done there, and have the error messages sent back to APO.
6. We also store the CVC context when logging the error message to the application log. This helps in
understanding which CVC led to which error. Check the screen shot below to get an idea of how
the error messages appear in the application log.
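In Python-style pseudocode (the real implementation is ABAP inside method COMBI_ENRICH, and `validate_in_r3` stands in for the single bulk RFC call of step 5), steps 3-5 amount to the following filter:

```python
# Sketch of steps 3-5 above, in Python for readability; the actual logic
# lives in ABAP inside BAdI method COMBI_ENRICH. validate_in_r3 stands in
# for one bulk RFC call that returns an error message per invalid CVC.

def enrich_cvcs(cvc_table, validate_in_r3, app_log):
    """Remove invalid CVCs from the table and log one error per rejection."""
    errors = validate_in_r3(cvc_table)  # single bulk RFC call (step 5)
    valid = []
    for cvc in cvc_table:
        if cvc in errors:
            # step 4: log with the CVC as context, then drop the row
            app_log.append((cvc, errors[cvc]))
        else:
            valid.append(cvc)
    return valid

log = []
cvcs = [("P1", "L1"), ("P2", "L1"), ("P9", "L9")]
fake_rfc = lambda t: {("P9", "L9"): "Location L9 does not exist in ECC"}
print(enrich_cvcs(cvcs, fake_rfc, log))  # the invalid row is removed
```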
G. Conclusion
I hope this helps anyone who has a similar requirement on CVC validation with ECC.

Hi Pankaj,
Splendid writing. I appreciate the way you presented the problem, explained your
analysis, and concluded with a solution, providing sufficient rationale and evidence.
Cheers,
Rajesh

APO DP Issues during Post Go Live


(Intensive Care) support
Dear Friends,
We are going live very soon. Can anybody tell me what common issues are faced in APO
Demand Planning post go-live, and what causes them?
I am working at client Location. Your help will be much appreciated.
Thanks,
Kishore

Hi Kishore,
Is it a first time deployment or enhancement of existing application?
Anyway, prepare a solid cut over plan and a failure mode and effects analysis (FMEA) kind of
report highlighting integration touch points and possible failures.
Few factors you should consider for Post Go-Live hyper care are:
1. If the DP planning area already exists in production and you are moving PA-related
changes, be sure to take a backup of your planning area and deactivate it before releasing the TR.
Otherwise, the transport will fail.
2. There might be security issues, i.e. actual user IDs have not been updated or mapped to the
appropriate profiles.
3. With respect to macros, it is better to verify the activation status after go-live. Sometimes they
are deactivated and require manual activation (quite rare, though).
4. Verify the period population of time bucket profiles. They should have been updated
appropriately.
5. Create time series objects for a sufficient period for the planning area.
6. Execute consistency checks (t-code /SAPAPO/TSCONS) to make sure your planning book
displays the data without any error.
7. You have to create the DP master data, i.e. CVCs, before populating your planning area with data,
and validate the application log. There should not be any red flags.
8. Validate your data transfer - verify the data transfer logs and the data itself in the planning book.
9. There might be issues with scheduled jobs. Verify the batch job definitions and the existence of the
required forecast profiles, job variants, and selection profiles created in production as per your
checklist, without any typos or authorization issues.
10. DP execution - results may not be as expected - verify the macros and the procedure. Check that
the executing user has understood the process and finished the required training.
11. DP output release - check the logs and resolve any issues reported there.
Thanks,
Rajesh

Issue with Displaying BUOM of


Products(aggregated level) at Interactive
planning in SCM APO DP 7.0
Hello SCM Experts,

I have an issue with displaying the BUOM of products at aggregated level in interactive planning
in SCM APO DP 7.0 EHP2.

We have PC as the BUOM at the planning area and KG as the BUOM in MAT1 for all products.
So, when we load a single product in interactive planning, the BUOM is read/displayed correctly
as KG, but when we load a bunch of products that all have the same BUOM (KG), the
planning-area BUOM (PC) is read/displayed instead.

Need help on this ASAP; a quick response would be appreciated.

Thanks in advance!

Ravi Kumar
Helpful Answer
That is standard functionality: what if one of the products does not have UOM KG, then what?

Are you able to manually change the UOM in the planning book after aggregating? I haven't
tried it, but it may work.

Hello

As I stated in the post:

if the products all have the same BUOM (KG), then it should display KG at the aggregated
level in interactive planning (standard functionality - we have maintained all AUoMs for
the products), but it is not working as expected.

Coming to your point: if a product has a different BUOM (CON), then it displays CON as
the BUOM in interactive planning, which is working fine.
Please let me know whether this clarifies your query, and please let me know if you have a
solution for my requirement.

Thanks
Ravi Kumar
Hi Ravi,

You need to implement the BAdI /SAPAPO/SDP_UOM_CONV to meet your requirement.
Please check the following SAP note for details:
Note 1408692 - BAdI to improve Unit of Measure conversion

Thanks,
Rajesh
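The display rule requested in this thread (and achievable via the BAdI mentioned above) can be sketched as follows; the product names and unit values are invented:

```python
# Sketch of the aggregate-level UOM display rule requested above: if every
# selected product shares one base UOM, show it; otherwise fall back to the
# planning-area UOM. All values are illustrative, not from any SAP API.

def display_uom(product_uoms, planning_area_uom):
    uoms = set(product_uoms.values())
    return uoms.pop() if len(uoms) == 1 else planning_area_uom

products = {"MAT-A": "KG", "MAT-B": "KG", "MAT-C": "KG"}
print(display_uom(products, "PC"))                      # uniform BUOM
print(display_uom({**products, "MAT-D": "CON"}, "PC"))  # mixed BUOMs
```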
Posts Tagged ‘APO alerts’
However there are many problems that are created by the Integrators/Consultants including our
“esteemed” blog writer due to their lack of knowledge of both demand forecasting as well as an
understanding of the tool. If they don’t understand the tool, they should not be implementing it
or training the users. The baggage they leave behind creates a mess that makes the software
worthless and unusable. These problems include:
1. Implementing APO as a mere typing tool and telling the users that the statistics are terrible and
not worth using.
2. Disabling statistical modeling under the pretext of security. In reality, the consultants are
worried about fielding questions from the users on statistics that they do not understand
themselves.
3. Enabling options and parameters in the tool without any idea of what they do to the resulting
forecasts.
4. Deciding on important things such as forecast aggregation, forecasting levels, and exception
management without any process discovery or user input.
5. Finally, not budgeting the project plan to include user training, particularly on statistics and
demand planning.
Perhaps companies should divide the implementation plan into two parts: process design, which
includes the provision for model tuning and training, and a second part that involves system
design and making the system work. The system design should follow the process design.
The model tuning should be done by the same consultant that is responsible for the process design.
And that person should definitely be an expert not only in the tool but also in best-practice
demand planning, along with expert skills in training the planners.
Given how many implementations are currently plagued with tool problems aggravated by
Consultant inefficiency and incompetence, perhaps more companies should think about re-
implementations of APO DP. Just throw away the old concepts and practices and start thinking
about how to fix the problems and make the tool more usable.
Finally SAP also needs to wake up and start fixing the problems in their most popular SCM
module namely APO Demand Planning. It needs to fix the error calculations and the alert logic.
More on that in a separate blog entry.
To set the record straight with this “esteemed” writer so he does not pollute the waters and
mislead many planners, I would say the following:
The blog writer concludes that SAP APO calculates the MAPE incorrectly. I agree with him to
a certain extent. SAP purports to compute MAPE using the academic definition of averaging
percentages, but it does not do this either: it goes into a hole when the actual demand is zero,
which makes the MAPE metric unusable. However, the other metrics, namely MAD and RMSE, are
correct.
I strongly disagree with this consultant/writer when he concludes that best-fit models are
erroneous because the error calculations are defective. If you poke around the underlying
mechanics, which are well documented in the APO online manual, you will see that the
optimization is done using the Mean Absolute Deviation, which is superior to the MAPE; MAPE
is a percentage and has some awkward properties.
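The behavior of the three metrics can be checked with a small numeric example (illustrative numbers, not SAP's internal code):

```python
import math

# Illustrative calculations of the error metrics discussed above.
def mad(actuals, forecasts):
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

def rmse(actuals, forecasts):
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actuals, forecasts)) / len(actuals))

def mape(actuals, forecasts):
    # Averaging percentage errors breaks down when an actual is zero:
    # that period's percentage error is a division by zero.
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

actuals, forecasts = [100, 80, 120], [90, 85, 110]
print(mad(actuals, forecasts), rmse(actuals, forecasts), mape(actuals, forecasts))
# With a zero actual, MAPE raises ZeroDivisionError while MAD and RMSE stay usable:
# mape([100, 0, 120], [90, 10, 110])
```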
Automodel selection 1 uses the MAD and picks only smoothing models, while Automodel
selection 2 also claims to include linear regression models. But in practice Automodel
selection 2 produces inferior results and expects the user to babysit the modeling by
feeding manual parameters!! In general, Automodel is not for the faint of heart, as many settings
have to be correctly configured. Given that most configurations are done by junior
consultants from big 5 consulting firms, you can guarantee that this is an unrealistic assumption.
Even with other forecasting packages we do not recommend best fit models or expert selection
as the final model. They are good starting points, but the planner has to do more in getting to the
right model and the demand forecast. They don’t have to be expert statisticians but they need to
understand their business and have a preliminary understanding of what various models do.
Yes, the statistical models in APO are straightforward; in fact, they are basic. There is no
complexity in them. They are not claiming to do Box-Jenkins or transfer functions or ARMAX
or any other models with esoteric names. However, I have found people using the MLR models
very cleverly, combined with forecast attributes. So it is all in the implementation, the model
tuning, and finally imparting that much elusive knowledge to the planners and finding a way to
sustain that knowledge.
My two cents = Do what you can and understand what you cannot. Some intellectual honesty
will also go a long way!
Happy Labor Day weekend!
Tags: APO alerts, APO DP, Automatic models in APO, Automodel 1, demand metrics, Forecast
modeling, Forecasting Alerts, SAP APO
Posted in Demand Planning, Forecast Modeling, forecasting software, Sales Forecasting, SAP
APO | Comments Off
Is Statistical Modeling an Afterthought?
Wednesday, March 21st, 2012
I just had a conversation with a Fortune 500 executive recently. He mentioned his company is
spending tens of millions of dollars currently upgrading from SAP APO 4.0 to SAP SCM 7.0
Demand Planning.
Come to think of it, what are the big differences between the 4.0 or 5.0 vs. 7.0? There are some
marginal improvements that the tech shop may admire but anything for the planning
community?!
Then we also hear that the planners have not been using the Statistical modeling feature in APO.
Will upgrading to 7.0 persuade the planners to use the Stat Models more? Not just more, just
even barely? Then I hear a pause and the IT consultant says that Stat models are not a priority
given the budget constraints they have.
So more millions before and no stat models. Now five years later, we have a shiny new upgrade
and again the Stats are not a priority.
I have been preaching Usability for the past few years.
Put together fine tools – But help the users in making the transition to the tool – give them
better understanding – Make the new tool more usable!
Give them the reports they need. Provide them an exception based workflow!
APO has good statistical models. They will help you move the peanut forward but only if they
are understood and leveraged.
We just re-launched the marketing campaign for our Usability Consulting. Model tuning
and model matching to product profiles are important elements of the Usability training.
Once implemented the Usability project will harmonize the use of models across planners from
various geographies for the same business/product family. There will be streamlined work flow.
We help you answer the following questions:
1. Am I using a Pareto Approach in my APO planning process?
2. How can I leverage APO DP to improve our forecast accuracy?
3. Why does APO mostly give me flat forecasts? How do I fix this?
4. What are Alpha, Beta, Gamma, Sigma and Theta? How do I leverage these parameters?
5. What is the correct level to model so as to improve the overall accuracy at the SKU level?
6. What are weighting profiles? How do they affect my final forecast?
7. How can I control the time trend using trend dampening profiles?
8. Are there products and customers that are better left to APO’s automated modeling strategy?
9. Which models to choose for what family of SKUs?
10. What are custom modeling profiles?
11. How is APO helping us simplify and improve the promotional planning process?
12. How do I create Multiple Linear Regression Models in APO?
13. Are we using the system defined error metrics in APO? Why are they different from the classic
MAPE calculations?
14. How do you conduct phase-in/phase-out of products?
15. When should I not use the Croston’s Model?
16. Why am I getting 9,000+ alerts every morning?

sap apo dp process chains


Hi guys

I know that a process chain is basically a background job, but I need to know exactly what a
process chain is and why we use it on a support project. Thanks.

Hi Ganesh,
Process chains are sequences of processes that run in the background after a particular event
is triggered. We give a start time and an end condition for these processes to run.
These process chains help us schedule daily background jobs/processes and thus handle the
complex schedules in SCM.

More details you can find in the link


http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/40c71c50-4601-2b10-
b9b7-a808ff2f3f2b?QuickLink=index&overridelayout=true&28385438947457
Hope this helps you in clarifying your doubts.

Thanks & Regards,


Anurag

Thanks, Anurag, that was a helpful answer, but do you have a guide on how to set up a process
chain and how to monitor the jobs?

Thanks
Anurag

Yes, Anurag, but that link does not show anything regarding the process chain setup.

Hello Ganesh,
The following SAP Help link is the only one describing process chains that I could
find for your need. Hope this helps you a bit:
http://help.sap.com/saphelp_scm50/helpdata/en/8f/c08b3baaa59649e10000000a1
1402f/frameset.ht
Process Chain
Definition
A process chain is a sequence of processes that are scheduled to wait in the background for an event.
Some of these processes trigger a separate event that can, in turn, start other processes.
Use
In an operating BI system there are a multitude of processes that occur regularly. If you use process
chains, you can:
 Automate the complex schedules in BW with the help of the event-controlled processing,
 Visualize the processes by using network graphics, and
 Centrally control and monitor the processes.
Fundamental principles of the process chain concept are:
 Openness
The abstract meaning of a process as any process with a defined beginning and end enables
openness with regard to the type of process that can be integrated into a process chain. The
principle of openness is applied to the theory behind process chains, in that both user-defined
programs and processes can be implemented. In addition, you can include process chains in other
process chains, known as meta chains. In doing so you are able to integrate process chains from
the system in which the meta chain is found, or from other systems. In this context, we are
talking about local or remote process chains.
 Security
Using process chains offers a high degree of process security, which is based on the principles
of background management:
 Processes are scheduled before they run and can be monitored with the standard batch
monitor.
See also: Process Chain Log Display
 Background events start subsequent processes.
 Short dumps and terminations are recognized and handled respectively.
 Flexibility
The subsequent process must get all the information it needs for a correct run from its
predecessors. This allows new process types to be integrated without the existing types having to
be adjusted.
Structure
A process chain consists of a start process, individual application processes and the collection processes.
Define the start of your process chain with the start process. All other chain processes are scheduled to
wait for an event.
The application processes are the actual processes. BI supports process types of the following
categories:
 General services
 Load process and post processing processes
 Data target administration processes,
 Reporting Agent processes
 Other BI processes,
as well as processes that you have implemented.
If they are used in other SAP applications, you have further categories available, if applicable.
Collection processes are treated differently by the process chain management. They allow multiple
chain strings to be combined to form one individual string. This allows them to replace multi-field
scheduling of the actual work processes.
Processes are connected using events that start a successor process after being triggered by a
predecessor process.
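The event-driven chaining described above can be sketched in miniature; the process and event names are invented, and the real mechanism is the BW background job framework, not Python:

```python
# Toy sketch of event-driven process chaining: each process waits for its
# predecessor's event, and the start process kicks everything off.
# Process names are invented for illustration.

class ProcessChain:
    def __init__(self):
        self.successors = {}  # event name -> list of waiting process names
        self.log = []

    def schedule(self, after_event, name):
        self.successors.setdefault(after_event, []).append(name)

    def raise_event(self, event):
        # run everything waiting on this event; each run raises its own event
        for name in self.successors.get(event, []):
            self.log.append(name)
            self.raise_event(name + "_done")

chain = ProcessChain()
chain.schedule("start", "load_data")
chain.schedule("load_data_done", "activate_dso")
chain.schedule("activate_dso_done", "rollup_cube")
chain.raise_event("start")
print(chain.log)  # processes ran in dependency order
```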
Integration
A process chain is a BI object with a transport connection and a connection to the BI document
management.
Automatisms
If you use process chains, the automatisms of the integrated processes (for example, update PSA data in
the data target, or activate data in the DataStore object) are ignored and you must implement them
using the process chain. If you schedule a specific process in a chain, the system supports you by
automatically inserting additional relevant standard processes, taking such automatisms into account.
If you use data transfer processes, the automatisms from InfoPackages are no longer available and you
must implement them using process types.

Third party job scheduler for APO


background jobs
Hi All,

I would like to know the pros and cons for triggering background jobs in APO systems using a
third party software. The project which I am working on has numerous jobs in DP/SNP/PPDS
and gATP.

Any help, information in this regard would be highly appreciated.

Warm regds,
Ashutosh,

In a pure SAP landscape I don't vote for a third party scheduler as probably the Return on
Investment is not justified. In a complex multisystem, multiERP landscape this could be a
good choice.

Refer to:

Re: External tools for monitoring job scheduling

This would give you a good insight. My personal experience says that Cronacle is a good
choice.

Hope this helps.

Hi Ashutosh,

Using a 3rd-party job scheduler has its own advantages and disadvantages, from my experience
in a previous engagement.

Advantages:

1) Challenges in monitoring jobs, in terms of effort and resources, can be reduced.

Disadvantages:

1) More chances of communication gaps, which lead to delays in retriggering.

2) Delays in corrective and preventive actions for job failures.

3) In the case of parallel jobs and job sequencing, knowledge of the 3rd-party scheduler is
required, or else it will not be effective.

4) Communication delays in case jobs do not start or fail.

5) If a process chain is delayed, there will be a lot of overlap with other jobs, leading to
performance issues.

Hence it is advisable to try to use your own resources as far as possible rather than 3rd-party
scheduling.

Regards,
R. Senthil Mareeswaran.
Hi,
I have used Maestro in a couple of projects.
The major pros and cons are covered in the replies above. One major advantage was
centralized monitoring across many SAP systems.

Now I do not advise any third-party tool, because with SAP Solution Manager
Enterprise edition and the RunSAP methodology, End-to-End Solution Operations takes
care of job monitoring, business process monitoring, etc. across SAP and non-SAP
system landscapes.

Releasing the Demand Plan to SNP


Dears,

I need more clarification on using transaction code /SAPAPO/MC90 to release demand to
SNP.

The APO Help says:

"independent requirements is not restricted to planning in SCM SNP. Planned independent
requirements within the production horizon are planned in production planning and detailed
scheduling in SCM PP/DS."

What does it mean that the data will be available in PP/DS?

Does the APO system generate capacity requirements? Is that what it means?

Regards
Rami

Rami

There are two horizons in the product master: the SNP production horizon and the SNP stock
transfer horizon. If your company uses both SNP and PP/DS, then after the forecast is released
to SNP, you will execute SNP and then, for the short term, execute PP/DS.

In such a scenario (where SNP is used for mid-term planning and PP/DS for the short term),
you need to maintain the SNP production horizon (in case of in-house production) and the
stock transfer horizon (for external procurement / branch-to-branch transfers) to define
the boundary between SNP and PP/DS. That is, you need to specify up to what day/week/month
you need the PP/DS plan, and for what time frame you need the SNP plan. The SNP
production / stock transfer horizon is the horizon within which SNP will not generate any plan.
For example, if your stock transfer horizon is 10 days, then SNP will not generate any supply
within these 10 days even if there is demand; supply within these 10 days is generated only by
PP/DS. SNP generates supply only outside the production / stock transfer horizon.

As specified by SAP, your planned independent requirements are thus not restricted to
SNP. That is, if the demand is within the production / stock transfer horizon, it is fulfilled by
the PP/DS run, and demands outside the horizon are planned by SNP.
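The horizon rule Saradha describes can be sketched as a simple date comparison (the dates and the 10-day horizon are illustrative):

```python
# Sketch of the horizon rule: demand inside the SNP production / stock
# transfer horizon is left to PP/DS; SNP plans only demand outside it.
# Dates and the horizon length are illustrative.
from datetime import date, timedelta

def planning_engine(demand_date, today, snp_horizon_days=10):
    boundary = today + timedelta(days=snp_horizon_days)
    return "PP/DS" if demand_date < boundary else "SNP"

today = date(2024, 1, 1)
print(planning_engine(date(2024, 1, 5), today))   # inside the horizon
print(planning_engine(date(2024, 1, 20), today))  # outside the horizon
```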

Thanks
Saradha

Hi
With transaction code /SAPAPO/MC90 (release demand to SNP), you release the forecast from
APO DP to SNP for planning. You can then execute the plan with the SNP heuristic, CTM, or
the optimizer.
The output of SNP planning can be used as a basis for making sourcing, deployment, and
transportation decisions, so this is rough-cut mid-term or long-term capacity planning. The
actual production is then planned in PP/DS.

I hope this helps clear your doubt.

Amol

Dear,

Many people have explained this technically. Practically, see the example below.

Our forecast horizon is 120 days, i.e. when I plan in Jan 2011, the Feb, March, April, and May
forecast will be entered by sales in DP.
The data is transferred to SNP (the SNP horizon: 120 days of independent requirements), and
the production and STO quantities are planned in SNP.
Then the Feb data (production planning horizon) is transferred to PP/DS for further
processing with finite scheduling, to get the correct capacity evaluation and the exact dates of
production start and end.

Hope this helps you understand the issue from a practical point of view.

Dear Raj,

Many Thanks for your Explanation. I really Appreciate your Effort


Transfer from DP to R/3
Hi Friends,
I am working on SCM 5.0 & ECC 6.0.
I want to transfer the forecast to R/3 as PIRs.
Is this done through CIF, or are creating a transfer profile and publication settings enough?
Can somebody please tell me the step-wise process for the transfer from DP to R/3?
I have created a transfer profile, did the publication settings, and then executed the job.
But I didn't get it in R/3 MD63.

Please help, it's urgent.


Points for sure.

There is no need for any integration model to transfer the forecast as PIRs to ECC.
There is a standard transaction code for this:
/SAPAPO/MC90 - Release Demand Planning to Supply Network Planning

The only important thing in this transfer is that whatever CVCs you have in APO (product,
location) need to exist in the R/3 system.

That's all.
Hi Vishal,
You need to create an integration model for the product-location combinations you want to
release the forecast for. You do this in R/3 with the CFM1 and CFM2 transaction codes. Let me
know if your question is answered.
Regards,
Kiran

Forgot to tell you, Vishal: it is through CIF only that you transfer demand from DP to R/3.

Dear Kiran,
You are making a mistake.
He is asking how to transfer the forecast from APO to R/3, which cannot be handled through an
integration model. There is a separate transaction code for this.
You can transfer PIRs from R/3 to APO with the help of an integration model.

Let me know if there is still confusion - or maybe I am the one confused... :)

Pravin

pravin

DP time series to SNP order series transfer: /SAPAPO/MC90
APO order series to R/3: publication settings
R/3 to APO transaction data: CIF

Interestingly, as Kiran mentioned, you need a CIF integration model for the location products.
Though this is not documented, you might need to set it up to make the link between APO and
R/3 (I believe APO checks the integration model to validate the location products for the
transfer of PIRs).

Dear Harish, I don't think it needs CIF.

And what would it check? Suppose you are using only DP - why would there be any need for an
integration model? You send the history through BW, so I don't think a product/location
integration model should be needed to transfer the forecast to R/3.
What do you think - is my logic correct, or am I missing something?

Definitely APO should validate the product and location, as they should be master data in R/3.
Of course you cannot send PIRs for a material that is not in R/3 (though this can happen:
since only the CVC is master data in DP, there is no check against R/3 as to whether such a
relationship exists).
Regards,

Thanks all,
But does this mean the forecast can be transferred only through an SNP planning area?
And should I have product masters maintained for the same?

In short, can I not transfer directly from a DP planning area to R/3?

Hi Vishal ... please find the process for releasing the forecast from APO to R/3 as PIRs:

<u><b>Pre-Requisites</b></u>:
• Product Masters are defined
• Location Masters are defined
• Distribution Types are defined.
(mySAP SCM - Implementation Guide -> Integration with SAP Components-> Integration of
SAP SCM and SAP R/3 -> Basic Settings for Data Transfer -> Publication -> Maintain
Distribution Definition or Generate and Delete Distribution Definition)

Then you define your transfer profiles:


SAP menu-> Advanced Planning and Optimization-> Demand Planning -> Environment ->
/SAPAPO/MC8U - Maintain Transfer Profiles
Then you schedule it in DP background jobs.

Please note:

Integration Models -> used for transferring data from R/3 to APO
Publication / Distribution -> used for transferring data from APO to any other logical systems.

<b>You can definitely transfer data from DP planning area to R/3.</b>

btw also check the APO and R3 queues for any transactions that are stuck

You don't need product masters.

In the transfer profile, fill in the values for Characteristic - Product and
Characteristic - Location:

the InfoObject name of the characteristic that represents products in the planning area. (If you
leave this field blank, the system reads the data for the characteristics 9AMATNR and
9ALOCNO.)

hope you have checked this part of the help


http://help.sap.com/saphelp_scm50/helpdata/en/c9/b4df8870c011d398450000e8a49608/content.
htm

I had faced a similar issue some time back. I remember setting it right by setting up a CIF model. I
suggest you try it out for one location product (or whatever the equivalent characteristic in DP
is) and see if it works. But if others have used it without a model, then maybe that is so.

Hello Vishal -

Look at the following excerpt on help.sap.com

Transfer of the Demand Plan to R/3 Demand Management


Purpose
This process transfers the demand plan to Demand Management in SAP R/3, and creates
planned independent requirements there.

Prerequisites
· You have installed the correct Plug-In version.

· You have set up your products in SAP APO by choosing Navigation -> Master Data
-> Product from the SAP APO menu tree.

· You have set up your locations in SAP APO by choosing Navigation -> Master
Data -> Location from the SAP APO menu tree.

· You have defined the split of product demand to locations in one of two possible
ways. For more information, see Location Split.

· If necessary, you have defined a product split in SAP APO. For more information,
see Product Split.
· You have created a data view for this task in SAP APO. The use of a separate
data view to transfer a demand plan to SAP R/3 has performance benefits. The data view
contains the following:

- A future planning horizon only (no historical horizon, unless you want to transfer
the historical horizon too)

- A planning buckets profile with one periodicity only (if it contains more than one
periodicity, the job is aborted with an error)

- Only the key figure(s) that you want to transfer to SAP R/3

- No actual rows (unless you want to transfer them to SAP R/3)

· You have set the planning strategy of each material in SAP R/3.

Hope this helps.

Regards,
Suresh Garg

Dear All,
I am getting very confused by this thread.
I had never worked on DP before; for the last 9 months I have been working on DP, and last
month I got APO certified.
As per my bookish knowledge and some practice, I don't think there is any need to create
products or locations in APO, and no need for the CIF model either.

Do you people think I am totally under a wrong impression, or what?

If yes, I think I need to try everything in the system; I should not just believe the books.

Regards!!
Pravin Mukkawar

Step-by-Step Statistical Forecasting Using SAP APO

Learn a 15-step methodology for executing forecasting projects in SAP Advanced Planning and
Optimization. Understand the most common methods of statistical analysis. Learn best practices for
implementing these methods in practice.
Key Concept
Forecast strategies are used in SAP Advanced Planning and Optimization to decide how forecast values
are calculated. Selecting a method depends on characteristics of the data, including seasonality, linear
trends, and which data is emphasized.
Statistical forecasting is a strong feature of the Advanced Planning & Optimization (APO)
Demand Planning (DP) suite and a lot of companies look at this capability of APO for an
effective demand planning process. The recent version of APO (SCM 7.0) covers a wide range
of statistical forecasting models. However, mere availability of models does not ensure the best
forecast result unless they are used effectively. The first few questions that probably come to
mind for any company looking for such a tool are:
 What are the best practices for using the APO statistical forecasting tool?
 How do I know which model best meets the needs of my business (as there are lots of models)?
Based on our experience of executing such statistical forecasting projects for clients from
different industries, we have put together a methodology for executing such projects. The
methodology is broken into 15 logical steps. We also provide a set of tips and tricks for effective
use of this tool and a set of case studies.
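Throughout the steps below, competing forecast models are compared using error measures such as MAD (mean absolute deviation) and MAPE (mean absolute percentage error), which APO reports after each forecast run. The comparison logic can be sketched as follows; this is an illustrative Python sketch of the error measures only, not APO's internal implementation, and the sample demand figures are hypothetical:

```python
def mad(actuals, forecasts):
    """Mean absolute deviation between actuals and forecasts."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

def mape(actuals, forecasts):
    """Mean absolute percentage error (periods with zero actuals are skipped)."""
    pairs = [(a, f) for a, f in zip(actuals, forecasts) if a != 0]
    return 100.0 * sum(abs(a - f) / a for a, f in pairs) / len(pairs)

# Hypothetical ex-post comparison: actual demand vs. two candidate models
actuals  = [100, 120, 110, 130]
constant = [115, 115, 115, 115]   # e.g. a constant-model forecast
trend    = [100, 110, 120, 130]   # e.g. a trend-model forecast

best = min([("constant", constant), ("trend", trend)],
           key=lambda m: mad(actuals, m[1]))
print(best[0])  # prints: trend (the model with the lower MAD)
```

In practice the same comparison is done per SKU over an ex-post horizon; the model with the consistently lower error measure is the better candidate strategy.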
Step 1. Finalize the Scope of the Statistical Forecasting Project
In any statistical forecasting project, the common tendency is to do statistical forecasting for
every possible stock keeping unit (SKU) that the organization sells. However, it is important to
finalize the scope of the project for two reasons.
 Statistical forecasting does not give the desired result in certain cases
 Sometimes being selective gives quicker results
 Forecasting does not give the desired result for certain SKUs, including these:
 New SKUs for which very little history is available and which do not closely mimic the sales
behavior of existing SKUs (where like modeling cannot be used).
 SKUs that the organization wants to discontinue in the next few months.
 Purely promotional SKUs that are sold for a very short period during the year, such as Christmas.
 Highly seasonal SKUs for which very little history is available. Ideally a statistical forecasting tool
needs at least 24 to 36 months of history of such SKUs to identify seasonality.
 SKUs for which there is a permanent capacity constraint (i.e., the organization always sells less
than the original demand of the SKU as it has a constraint in production capacity).
 SKUs with highly variable or unpredictable supplier lead time and production lead time.
Variability during replenishment skews the actual demand and makes forecasting unreliable.
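The 24-to-36-month guideline for seasonal SKUs has a simple statistical basis: detecting a yearly pattern requires at least two full cycles of history. A minimal Python sketch of such a seasonality check (an illustrative heuristic only; the lag-12 autocorrelation test and the 0.3 threshold are our assumptions, not an APO setting):

```python
def autocorr(series, lag):
    """Autocorrelation of a series at the given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag))
    return cov / var if var else 0.0

def looks_seasonal(history, period=12, threshold=0.3):
    """Heuristic seasonality check; needs at least 2 full cycles of history."""
    if len(history) < 2 * period:
        return False          # too little history to judge seasonality
    return autocorr(history, period) > threshold
```

With only one cycle of monthly history the check cannot fire at all, which is why SKUs with short history are poor candidates for seasonal models.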
From our experience, it is also important to be selective when starting such a project. A quick
ABC analysis of SKUs based on sales volume can be handy here. Identify the SKUs that
contribute 80 percent of sales and put most of the model-selection effort into them. If better
statistical forecasting can improve the forecast accuracy for these SKUs, it will have a
positive effect on the overall business and can deliver quicker results. While in the long run
forecasting needs to be extended to all SKUs, it is always better to start with A and B category
SKUs.
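The ABC analysis described above can be sketched as follows. This is a hedged Python illustration under the assumption that SKUs are ranked by sales volume and cut at 80% (class A) and 95% (class B) cumulative share; the thresholds and function name are ours for illustration, not an APO transaction:

```python
def abc_classify(sales_by_sku, a_cut=0.80, b_cut=0.95):
    """Classify SKUs by cumulative share of total sales volume.

    sales_by_sku: dict mapping SKU -> sales volume.
    The 80%/95% cut-offs are illustrative defaults, not an APO setting.
    """
    total = sum(sales_by_sku.values())
    ranked = sorted(sales_by_sku.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for sku, volume in ranked:
        cumulative += volume / total
        if cumulative <= a_cut:
            classes[sku] = "A"
        elif cumulative <= b_cut:
            classes[sku] = "B"
        else:
            classes[sku] = "C"
    return classes

# Hypothetical volumes: SKU1 alone carries 70% of sales
classes = abc_classify({"SKU1": 70, "SKU2": 20, "SKU3": 7, "SKU4": 3})
print(classes)  # SKU1 -> "A", SKU2 -> "B", SKU3/SKU4 -> "C"
```

The A and B classes returned here are the SKUs on which to concentrate the model-selection effort first.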

Adding a new keyfigure to Planning area is possible when it is active?
Hi,

For a new requirement, I just created a key figure in the dev system.

As for the procedure, I de-initialized the planning area, added the new key figure to the
planning area, and then to the planning books. That solved my purpose.
The problem arises when I transport the same to the Quality system: I am not able to see the new
key figure in the planning area in Quality. I expected that, even though the planning book is active in
TAQ, the key figure would be added when it came with the transport.

Do you think my expectation is wrong? To be frank, when it comes to production we can't de-
initialize the planning area to add a key figure, as you know how much risk is involved in this process.

Is there any way that we can add a key figure without de-initializing?

Thanks in advance

ARUN R Y

Hi Arun,

Two things here.

1. When you transport, you have to transport the key figure that you added first to Quality and
then transport the planning area. You have to follow this sequence.

If you have already transported the planning area and did not transport the key figure, then send
another transport to the Q system with the key figure, then re-import your planning area transport,
and this should solve your problem.

2. Effective from SCM 5.0, you do not need to de-initialize the planning area to add a new key figure.
Go to /N/SAPAPO/MSDP_ADMIN, right-click on your planning area > Change Key Figure
Settings, and when prompted Yes or No, click Yes; there you can add/remove key figures.

Don't forget to do a consistency check on the planning area afterwards, just to avoid any inconsistencies.

Hi Venkat,

Thanks for your mail.

1. I have followed the sequence: first I moved the key figure and then the planning area.

2. I am able to see the key figure in the InfoObject catalog on the right-hand side of the planning area
in the Q system.

3. I am working on version 4.1, so I can't find the suggestion you made in my version.
So, if I de-initialize and assign the key figure in the Dev system, do I need to repeat the same in
Quality also?

I expected that during the transport itself it would be assigned to the planning area in Quality...

Please suggest how I can move on this now. Is the only way to de-initialize in Quality again,
assign the key figure, and repeat the same in Production?

Thanks in advance

Arun R Y

Hi Arun,

Adding key figures without de-initialization of the PA is only possible from SCM 5.0. Since you are on
4.1, you have to transport the key figure first and then the planning area. That is your only option.

Your QA system is usually closed for changes, so even if you de-initialize, you may not be able to
make changes to the planning area. It is also not advisable to do that.

Hi Arjun,

This is very useful for me; I finally got this after a long time.

babu

Hi Arjun, after activating a Planning Area you can add an extra key figure; it is possible in version 7.0
without de-initialization of the PA.

I am not sure if you can transport a PA if it is already active. I just tried it using v7.02 and got errors
while transporting it. Once I deactivated the PA, the transport worked seamlessly.

Adding new keyfigures to active planning area SCM 7.0

Hi,

I want to add additional key figures to the active planning area in DP. We are using SCM 7.0, and
I am aware that we can do it by right-clicking on the planning area --> Change Keyfigure Settings.

I have a couple of questions.

1. The planning area in DEV and QA contains the additional key figures required by the business,
but Production does not. If I transport the request from QA to Production, what will be the
impact? One more point: in Production there are additional key figures which are not in
QA. When I move the request from QA to Production, will there be any impact on the key figures
which are already in Production?

2. If I use the program /sapapo/ts_lcm_parea_change2, do I need to take a planning area backup?

Some suggest taking a backup and some suggest directly adding the key figures. Which is the best way
without losing existing data, and is it recommended to do this directly in Production?

Please reply with the best possible options: whether to re-transport the request or directly add key
figures to the planning area.

Regards,

Dhanunjay.


Dhanunjay,

If you have KFs with data existing in Production, I would suggest two options.

1. If you do not want to take a backup of the PA, then do not use the transport option; add the KFs
directly in Production.

2. If at all you want to use the transport path, then I would suggest you take a backup first and then
move forward.

Also, if you do not have time constraints, use this as a good opportunity to get a successful backup
process in place, since anytime in the future you want to transport PA-related changes you will be
presented with the same issues time and again. For future enhancements too, this will be a
useful tool.

Hope this helps,

Hari Vomkarey

Hi Dhanunjay,
I am also trying the same. When I tried to transport the planning area to QA, the newly added
key figure was not transported into the QA system.

Please let me know whether you deactivated the planning area in QA and moved it
through a transport request.

Even after deactivating the planning area in development, adding the key figure, and sending it to
QA, it didn't work.

It looks like we have to deactivate the QA planning area before moving the planning area from
Dev to QA. For this approach we need to have a backup before we can proceed.

I have now directly added the new key figure in QA and sent only the planning book with the
newly added key figure from DEV to QA.

It is working nicely in QA. I have not yet added the new key figure in Production, but
as per the current trial, everything works fine in QA.

For any approach, having a backup is always best.

You have mentioned there is a difference between the DEV/QA and Production key figures. I
am not sure, but if you move the planning area through a transport request, it will move only
the DEV/QA key figures. So I believe this will impact the key figures
which are not in Production.

As in SCM 7.0 we have the option to add a key figure without deactivating the planning area,
it is better to create the key figure directly in Production. Next week I am going to try it in
Production. If the thread is open at that time, I will let you know.

Regards,

Saravanan V

Hi

If you are working with SAP SCM 5.0 or higher, you don't need to deactivate the planning area.
Directly add the new key figures by going to the planning area: Extras > Key Figure
Settings.

Regards

Dam

Hi Damodhar,

First of all, apologies for jumping into the middle of the thread with a question. It
seemed more appropriate here.

I am working on SCM 5.0. Would there be any reason for this 'Key Figure
Settings' option to be grayed out?

1. I have 3 different SNP PAs, and for one of them the 'Key Figure Settings'
option is active while for the other 2 PAs it is grayed out... nothing special here... all
three of them are copies of the same standard SNP PA.

2. Even for the PA for which it is active, when I try to add a KF with this option
without de-initializing the PA, it throws a strange error: "Planning object
structure ID Z9SNPFCS is invalid". But the POS connected to this PA is
9ASNPBAS... Any ideas why this Z9SNPFCS is coming up?

Hi

Could you please run the consistency check for the MPOS and try again?

Regards

Dam

MAD: sporadic demand, low-volume demand

MSE: A-type products
Transfer global settings: fiscal year variant
