Integrating a Pricing/Profitability
Solution with an ERP Platform
Preface
Introduction
The following pages constitute a White Paper contribution to Deloitte’s Consulting Knowledge Exchange (KX) database. The learning experience we decided to abstract, generalize and formally document in a contribution like this is the result of joint work executed at a client from February to June 2009.

We agreed not to focus only on the technical aspects of the know-how acquired, so that it could be leveraged by any other practitioner assigned similar work. We agreed on providing a holistic approach to our learning, based also on the value that, from a business standpoint, an integrated Pricing/ERP solution can bring to an organization. We have also formally documented lived experiences and tips so others can take advantage of what we learned. Other practitioners may then make different mistakes rather than the ones we were able to document, and those new ones could in turn be documented in successive extensions of this work.

This paper is perhaps far from following the most rigorous academic standards of paper writing. Instead, it has been written as a more pragmatic tool to help a practitioner who has been assigned related work. In that direction, its storyboard follows this line:
- First, provide context in terms of business purpose and value for the client organization. That is, it first tries to answer the question: “What is in this for the client, and why is integrating a pricing solution with an ERP platform crucial to delivering value?”
- Then cover the Revenue and Pricing/Profitability fundamentals in as simple and pragmatic a fashion as possible. These are the essential concepts to rely on, and to understand very clearly, so that successful, quality integration work can be completed
- Then dive deep into the “How” of the integration work. Both the Outbound (Pricing Solution → ERP) and the Inbound (ERP → Pricing Solution) arms of the overall process have been covered
- Finally, document know-how in terms of Testing, Deployment and Lessons Learned. The Lessons Learned section is possibly one of the most value-adding parts of the entire paper for our Practice, and that is the approach that was followed in writing it
We certainly hope to have provided with this paper a truly useful contribution, initially to our Pricing Practice and then also to the broader Deloitte Technology Service line. We modestly aim in that direction because we believe the overall approach we have documented could in reality be applicable to any integration-type work.
Objectives
The objectives of this contribution are detailed as follows:
• To provide a “package agnostic”, comprehensive and holistic methodological framework to guide any future Pricing-to-ERP integration work stream that is part of any Revenue/Pricing Management engagement. Vendavo® and SAP®-based illustrations of all the concepts and techniques have been provided throughout this paper, as they relate to our most recent experience. However, the core tools, techniques and methods that are part of this approach have been documented with their relevance and use in mind no matter which specific packages are adopted as Pricing and/or ERP solutions
• To base that methodological approach on a solid set of concepts and fundamentals of Pricing, Profitability, Revenue Management, data flow, data design and deployment. The more solid, clear and articulated these concepts are in the mind of the practitioner, the more productive the corresponding experience will be
• To transmit truly empirical tips, advice and recommendations, grounded in tireless day-to-day work, to our colleagues. There is nothing like having done it, and we could not resist the desire to experience how gratifying it is to let others know what we learned so they can be more productive and even more value-adding to their clients
Tip:
This sign indicates a section of the paper where a recommendation or piece of advice based on actual, empirical experience on an assignment is provided. The advice will be know-how oriented
We believe it would not be totally off the mark to say that, to integrate A and B, you do not necessarily need to be an expert at doing A and/or B alone, separately, very well and very specifically. But to successfully integrate, you have to:
- Have the willingness to dig into the intrinsic nuts and bolts of A and/or B, to be able to at least understand the basics of what the A and/or B “alone” experts will be telling you daily
- Understand the related business processes very well before getting into the technical aspects of the work
- Understand the fundamental business-process concepts involved on each “arm” or “side” of the integration work (data coming in and/or data going out), to be able to communicate in a clear, unambiguous, concise and articulate manner with the many counterparts you will be contacting and talking to all along
- Meet and get to know people every day, and establish rapport and successful communication streams and relationships with each one of them
This paper is intended to be “package agnostic” because, when a practitioner is assigned integration work, she or he will not be responsible for either “A” or “B” individually and separately, but for ensuring that the outputs and inputs from/to A/B end up where they need to be. If the related methodological approach is clear, and if the practitioner can develop the skill of identifying and building productive working relationships with the right people on those projects who are separately responsible for “A” or “B”, success while remaining “package agnostic” will be possible. Obviously, knowing more about A or B would help, but such knowledge would at best be a necessary yet not sufficient condition for successful integration.
Problem Statement
We articulated the following bullets in an effort to state the problem we attempted to resolve with this contribution:
• To establish a conceptual foundation for any (functional/technical) design of an interface between a Pricing/Profitability solution (any distinct package) and a given ERP platform (SAP, Oracle, LAWSON, an in-house legacy system, to mention just a few)
• To contribute an approach that is independent of any particular Pricing/Profitability optimization tool and could be leveraged on any Pricing-related engagement
• To contribute an approach focused first on understanding business processes, using Industry Print© as a tool to model the related business processes
• To progress into the technical features of the approach while maintaining a “tool-independent” mindset on the Pricing/Profitability side
From a practical and value-adding perspective, a client’s historical execution data (past sales and the past direct and indirect costs incurred to realize those levels of sales) will feed the present process of proactively setting product/service prices, trying to increase and if possible maximize profitability. Automating this process for efficiency will provide significant value to the business, as this dynamic will feed an enriched, more realistic, better-adapted and more effective decision-making process oriented toward increased profitability.

Described as such, this is a continuous (closed-loop) process for which automation provides the value of enabling decision making that takes into account, with ever greater accuracy and speed, the reality of how sales move and how costs are incurred, on a weekly basis for example. After all, the organization’s prices will be a function of its market, its competition and its operational direct and indirect costs. If an integral solution such as the one whose know-how is described in this paper is implemented successfully, the client organization can start implementing pricing and profitability (and possibly cost-reducing) measures that will provide it with an edge over its competitors.
In Figure (1) below, the following value-adding drivers for taking Price and Profitability decisions efficiently and effectively can be identified. The figure describes an example based on a typical CIP (Consumer and Industrial Products) value chain (Raw Materials → Manufacturing Process → Retailing Process → End Customer):
- Decision Making Based on Transactional Analytics: A decision-making process based on the usage of a Price Analytics tool that provides very close to real-time information on how sales move and how the costs related to those sales move.
- Decision Making Based on Demand Modeling Techniques: The usage of a Price Policy Management tool that is able to receive and incorporate input on current product/service demand trends and to use it efficiently to formulate sound price-setting policies.
- Decision Making Related to Developing and Putting in Place an Optimized Price Strategy: The use of a Price Policy Management tool that accurately and efficiently feeds transactional data into the Price Analytics process that precedes the establishment of sound Pricing Policies. Increased profitability is the ultimate goal being pursued.
- Effective (in Terms of Profitability) Deal Making and Execution (Enforcement of Price Policies in Place): There is a lot of value to be realized in being able to monitor sales-representative behavior that is not in line with established Pricing Policies. A tool based on strong and accurate Price Analytics allows Price Management decision makers to establish policies geared towards realizing increased profitability. However, it is enforcing those policies that ensures value realization; hence, being able to rely on a tool that makes the enforcement process more efficient closes the complete loop of value drivers.
Technical Contribution Geared Towards Helping You to Deliver Business Value to Your Client (TCV)

[Figure (1): A Raw Materials → CP/Manufacturer → Retailer value chain with an example retail price-optimization view: current position; all constraints (8 bps); all constraints refined (5.6 bps); assess risk and determine volume/price trade-offs; determine elasticity and use suggested prices]

A practitioner will often be assigned the role of designing and implementing the Integration component of a Pricing Solution with a given underlying ERP platform based on his or her technical skills and related past experience. Nonetheless, the likelihood of success on the assignment is directly related to the practitioner’s capacity to understand the importance of the results of his or her work in the framework of what the client expects in terms of value.

“TCV”, or “Total Client Value”, in the case of a Pricing/Profitability engagement is directly related to how clearly Revenue Management decision makers are able to derive new Pricing Policies, decisions in terms of cost reductions, and decisions on how to promote the organization’s products or services more cost-effectively and with increased revenue. Clear TCV, enabled by a solution whose Integration component ensures that data flows smoothly in and out of the solution, providing an accurate transactional and price-setting picture of the organization’s business, is the ultimate goal of a sound integration component. A solution embracing the best-of-breed technology would not necessarily be the desired one if decisions of that type did not consider cost, effectiveness and ease of use.

A practitioner who goes above and beyond his or her technology-related responsibilities, challenging every move in terms of how much impact a decision will have on value delivery, will have a greater probability of successfully delivering a solution of this nature. She or he will also lead the client to think of Deloitte when speaking with other organizations, benchmarking with them on how to use Pricing/Profitability tools to gain an edge in the market.
Tip: Always ask yourself: how much more efficient and productive will my client be, because of the usage of your final deliverables, in realizing the trends that are hurting, or could improve, its profitability?
We will start this section by quoting the main headline presented in Figure (1) above. The headline reads:
“The Value Adding Business Imperative for which a quality Interface implementation is key: A holistic
approach which utilizes data from multiple sources can achieve greater profit potential by providing insight and
managing margins proactively”
The most important concept expressed in the headline above is, in our judgment, that of being “proactive”. When complex decision-making processes need to look at multivariate information, data must be gathered from diverse sources but then brought together and presented to the decision makers in a condensed yet complete and intuitive way. Adequate speed and accuracy are two characteristics that must define the process of bringing such information before the eyes of the decision makers so that decisions are taken at the right time.

Without effectively integrating data (on an automated basis) coming from diverse sources, and presenting that data in a way that generates enlightening, trend-discovering information in the minds of Revenue Management decision makers, taking proactive decisions will not be possible. Hence, value delivery will be compromised.
Sound work integrating such data sources is the cornerstone of a data platform that makes possible the application of a “holistic approach towards value delivery”, as claimed so far in our contribution. In synthesis, there will be no value realized without being able to smoothly put “everything together”.
Though technical skills will definitely be important and helpful every step of the way, a few other, perhaps “soft”, skills will be at play when one has the responsibility of doing integration work. Here is a tentative list of those traits, comprising perhaps the “right attitude” for tackling a responsibility of this nature:
- Calm and serenity: At times, you will feel overwhelmed by client and firm representatives expecting answers from you to many questions in domains you are not directly involved with. That is, you will depend on others to help you, but you will be asked questions as if you were primarily responsible for those other tasks which are part of the final result of the integration work you are doing. That is simply because integration is done when the “parts” plus the “whole” are done. Even when your primary responsibility is the “whole”, its completion will not be possible without the “parts” being done. This situation will create a lot of anxiety, but you need to stay calm, be prepared by doing your due diligence in contacting the people you depend on, and then be able to explain this, with serenity, to your Management and client representatives.
- Persistence: People on both the “A” and “B” sides will say they will help you out, but remember that even though they know they are part of a team, they will put their own responsibilities, and whatever results they are measured by, first on their agenda (as anybody would). The secret is not to get discouraged when you see, time and again, how hard it is to get the information you need to understand what needs to be done. Pressing forward harder, but with tact, every time will be key.
- Due diligence: As you will be asking others for help, you will unavoidably be taking up part of their time. So you cannot simply appear asking for help without having done your homework and without being ready to explain to the other person what you will be doing to help them help you. Be sure to show what you have done on your own to anticipate the answers you are expecting from the teammate whose know-how and expertise only she or he has.
- Communication: Be frank and straightforward with the various Management-level representatives on your project as quickly as possible. Do so whenever you feel things are not getting the needed traction, or whenever a major difficulty or road-block brings an unexpected hurdle. Try to listen as much as possible, especially at the beginning of your engagement during the “sponge” few days you may have. You may feel that you do not have enough sponge time, but do not get discouraged by that. Communicate often, especially when you are in trouble in terms of advancing at the speed you consider you should. If you are still in the process of proving yourself on the engagement and you do not yet have the necessary confidence and rapport with your Project Manager, rehearse what you need to communicate with a buddy of yours on the project, or with another peer within the firm whom you trust. You can do it on the phone too. For instance, it may be the case that you went to training with a peer before getting staffed on the project and gained rapport with that person. Rehearse with her or him what you need to communicate, but do so as quickly as you possibly can.
- Delegation: Delegate if you can; however, always support and be ready to answer any type of question from the person you delegated to. If you cannot delegate, talk to your Manager and be clear about what you feel you will humanly be able to deliver after a significant, pushed-to-the-limit but humanly possible effort. Sometimes you may be forced to delegate a task and train the person you are delegating it to at the same time, because deadlines are set and the person available to help you needs orientation before she or he can produce for you. If you have to do the job yourself, be sure you set expectations clearly with your Manager. Strive to commit to deadlines, but also be frank when Management expectations, based on what has been committed to the client, do not necessarily match the reality of what seems possible to deliver with the required level of quality.
Tip: Spend time getting to know who is who on your project. Depending on the subject at hand, you will probably need to figure out who the right parties to talk to are on projects of 200-300 team members. Build relationships every day. Your success is based on getting the right information from the right people, not on being an expert in either A or B, if integrating A and B is what you are doing
Tip: Absorb information progressively; do not try to know it all at once. The details and know-how you are looking for will build on themselves incrementally
The following is a generic, succinct description of the so-called “Revenue Management Optimization Model”. We have described this process in detail because the implementation tasks that are part of the Integration work stream bring this model to life and will allow your client to realize value once it is institutionalized within the organization.
We found it more practical to explain the model as a series of sequential steps:
Step (1): Assume a set of Price and Profitability Policies has already been defined, even before your integrated solution has been deployed. Daily sales execution will generate accumulated operational data that you will want to bring into your integrated Pricing/Profitability Optimization Solution periodically, in order to predict and manage the future with better and more effective decisions, based in turn on what happened in the recent past. Such historical data will need to be transformed and “allocated” (the “allocation” concept will be explained very extensively shortly and revisited later in this contribution) so that your solution is able to build graphs that help decision makers identify trends and decide upon them. The fundamental cornerstone of this step is being able to “allocate” accumulated sales and cost data at the transactional-line level.
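As an illustration of the allocation cornerstone, the following is a minimal sketch (all names and figures are invented, not from any specific Pricing package) of prorating an aggregate period cost across invoice lines:

```python
# Illustrative sketch of transaction-line cost allocation: an aggregate
# period cost (e.g., monthly freight for a route) is spread across the
# invoice lines of that period in proportion to a chosen basis.

def allocate_cost(lines, total_cost, basis="volume"):
    """Return new lines with an 'allocated_cost' field, prorating
    total_cost over the chosen basis (here, unit volume)."""
    basis_total = sum(line[basis] for line in lines)
    if basis_total == 0:
        raise ValueError("Allocation basis sums to zero")
    return [
        {**line, "allocated_cost": total_cost * line[basis] / basis_total}
        for line in lines
    ]

# Example: allocate a 900.00 monthly freight cost across three invoice lines
invoice_lines = [
    {"invoice": "INV-1", "line": 1, "volume": 100},
    {"invoice": "INV-1", "line": 2, "volume": 50},
    {"invoice": "INV-2", "line": 1, "volume": 150},
]
allocated = allocate_cost(invoice_lines, total_cost=900.0)
# Allocated costs are 300.0, 150.0 and 450.0, proportional to volume
```

Real allocation methods are usually more involved (different bases per cost element, multi-level drivers), but each one reduces to a rule of this shape applied per WFE.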
Step (2): Revenue Management professionals at your client organization will perform “Pricing Analysis” (PA) activities to identify trends and synthesize recommendations on which to base decisions in terms of:
- Pricing Strategy: Increase prices slightly for those clients generating most of the revenue, for instance. A decision like this will have an immediate impact on the organization’s bottom line
- Cost-to-Serve (CTS) changes: Promote less to clients generating less revenue, or behaving less in accordance with what was agreed with them. Or, on the contrary, promote more to low-revenue-generating clients if it is identified that they have historically been supported less
- Outlier identification: Support the application of Pareto-rule concepts and decide accordingly. For example, if the organization incurs significant costs driving its trucks to clients on distribution routes that are not generating a sound volume of sales, it may decide to reduce monthly visits significantly and, instead, visit more frequently those customers driving a higher volume of sales
Step (3): Based on the results of the transactional Price Analysis activities, major trends are identified, and those trends feed the Price Setting/Policy Preparation activities. Price Policies incorporating decisions based on the results of the Profit Analysis in Step (2) will be published, and the data for a new set of “Price Tables” will be interfaced onto the corresponding organization’s ERP system.
Step (4): Sales representatives will be provided with a tool which now enforces compliance with the defined and published Pricing Policies. Deviations will be detected automatically, so proactivity will be at play. Outlier-type situations will be detected, and the organization’s management will be able to act upon them before a contract with the end client is signed.
Step (5): The new prices will underpin the new sales data being accumulated daily. We then go back to Step (1) above and begin the entire cycle again.
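The five steps above can be sketched as one pass of a closed loop. The function and stub names below are hypothetical placeholders; in a real implementation each would call the Pricing Solution and ERP interfaces:

```python
# Hypothetical one-pass orchestration of the five-step Revenue Management
# Optimization cycle. Each step is a trivial stub standing in for the
# real Pricing Solution / ERP calls.

def load_and_allocate(sales):
    # Step 1: transform/allocate accumulated data into Price-MART lines
    return [{"line": s, "allocated": True} for s in sales]

def analyze_trends(price_mart):
    # Step 2: Pricing Analysis identifies trends and recommendations
    return {"raise_price_top_customers": len(price_mart) > 0}

def publish_policies(current, recommendations):
    # Step 3: publish new Price Policies / interface Price Tables to the ERP
    return {**current, **recommendations}

def detect_deviations(sales, policies):
    # Step 4: enforcement tooling flags out-of-policy deals proactively
    return []  # nothing flagged in this stub

def run_pricing_cycle(erp_sales_data, current_policies):
    price_mart = load_and_allocate(erp_sales_data)
    recommendations = analyze_trends(price_mart)
    new_policies = publish_policies(current_policies, recommendations)
    deviations = detect_deviations(erp_sales_data, new_policies)
    # Step 5: the new policies drive new sales data; the cycle repeats
    return new_policies, deviations
```

Institutionalizing the model means scheduling this loop (weekly, for example) rather than running it as a one-off project deliverable.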
The following is a succinct definition of the Price Waterfall, blended from being part of Pricing/Profitability training and technically leading one Pricing Solution → ERP integration work stream:
“A graphical representation of the different components that build up the way a given organization charges end clients for its products/services, as a function of its different cost and profitability financial elements”
An organization’s Price Waterfall is the cornerstone figure helping Revenue Management decision makers communicate, using the same language, about the decision drivers involved in setting the company’s pricing policies. On the “Y” axis of this graph, the “level” of a price component in a desired currency is established. On the “X” axis, the different “components” taken into consideration in the process of setting the final price of a product/service are listed from left to right. The different components that structure the final customer/retail price are called Waterfall items or “elements” (WFE’s). Each bar on the graph identifies a “WFE”. A given bar may or may not be labeled with the “red” color. When a WFE is labeled in “red”, it affects the accumulated price with a (-) (negative) sign. The other type of bar on the graph represents “Price Points”. “Price Points” are shown on the example graph below in “grey”, while WFE’s are either “green” or “red” bars. A “Price Point” is a derived WFE; that is, on its own it does not represent a price component but is merely the result of the addition or subtraction of two other WFE’s. “Revenue reducer” WFE’s are normally WFE’s that reduce the price or “negatively” adjust the accumulated product/service price.
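To make the sign conventions concrete, here is a minimal sketch of how WFE’s and derived Price Points combine; the element names and amounts are invented for illustration only:

```python
# Each waterfall entry is (name, amount, sign): sign +1 for a normal WFE,
# -1 for a "red" (revenue-reducing) WFE, and 0 for a derived Price Point.

waterfall = [
    ("List Price",           1000.00, +1),
    ("On-Invoice Discount",   120.00, -1),  # red: reduces the price
    ("Invoice Price",           None,  0),  # price point (derived subtotal)
    ("Rebate",                 80.00, -1),  # red: revenue reducer
    ("Pocket Price",            None,  0),  # price point (derived subtotal)
]

def compute_waterfall(elements):
    """Return (name, value) pairs; price points receive the running total."""
    running = 0.0
    out = []
    for name, amount, sign in elements:
        if sign == 0:                # derived: not a component of its own
            out.append((name, running))
        else:
            running += sign * amount
            out.append((name, sign * amount))
    return out

for name, value in compute_waterfall(waterfall):
    print(f"{name:20s} {value:10.2f}")
# Invoice Price resolves to 880.00 and Pocket Price to 800.00
```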
(6) main types of WFE’s will be discussed in a subsequent section of this paper:
- Price Setting Items: Determined based on demand predictions, elasticity models, market/industry trends, competitor information, external factors and governmental policies, among others.
- Negotiable Items: Included on end-client invoices to adjust (up or down) the initial Price Setting total. These elements are negotiable with the end client, typically based on behavioral factors. Example: if a client hits a certain sales target in a given period of time, it may be entitled to a discount on the price being offered for the products sold once that target is hit
- Invoiced Items: Simply included on end-client invoices
- Revenue Reducing Items: Typically items related to incentives on promotional activities, such as rebates and other discounts given to foster increases in sales
- Chargeable Items (CTS): Constitute a very important part of the Waterfall. They relate to elements that encompass how much it costs the organization to make its products or services available for sale in the market (transportation costs and other promotional costs, among many others; details will be provided further on in this contribution)
- Cost of Goods Items: Constitute a very important part of the Waterfall. They relate to elements that encompass how much it costs the organization to produce its products or services (COGS, “Cost of Goods Sold”), plus incorporating profit margins appropriately
As becomes apparent by now, delivering value from the use of a tool such as a Price Waterfall stems greatly and primarily from the ability to properly populate the underlying data so that it accurately reflects the reality of the business as frequently as possible. Transactional data builds the Waterfall; it is used and processed, invoice line by invoice line, to determine the relative values of each Waterfall element. Price and Profit Analytics allows for the formulation of sound Price Policies, which are then enforced, also using the tool.
The Integration work stream’s responsibility is to bring the Waterfall tool to life so it can be used to provide the value it is intended to provide.
Figure (2): An example of a Waterfall representation for a Food/Beverage CIP client-type organization
The following is a succinct definition of the Price-MART concept. A Price-MART is more an Information Technology-driven concept than a business-related one. It essentially describes how Waterfall data, including multiple data “dimensions” (for a CIP Food/Beverage organization, such dimensions could be “Country”, “Region”, “Product”, “Market”, “Sales Organization”, “Customer” and many others), is organized and stored in an automated database that is part of the Pricing/Profitability solution being integrated with the corresponding ERP platform in question.
See Figure (3) for a graphical, very schematic, representation of a Price-MART
Figure (3) – A Price-MART Schematic Representation
The following are the main characteristics of the “Price-MART”, which is the core data modeling/storage component supporting any Price/Profitability Optimization tool:
• It follows the same overall “data-mart” concept that we know from basic Data Warehousing definitions and knowledge
• However, the data stored revolves specifically around prices and around the process that leads to establishing Pricing Policies. The main attributes of a line of sale, plus how much a given invoice line cost and how much profitability it left, are stored in the Price-MART
• Each line corresponds to one invoice line that is part of final sales transactions involving the organization’s products/services
• There is a section (at the beginning, labeled on the graph above as the “Derived Attributes” section) composed of attributes which can be direct or derived (as a result of adding/subtracting (2) WFE’s) (Product, Customer and Sales Organization-related attributes)
• Attributes can be alphanumeric (e.g., Product Code) or numeric (e.g., Base Price)
• The rest of the fields are “Waterfall” elements, both actual WFE’s and “Price Points”
Part of the Integration work stream’s responsibilities is to perform all the technical activities (functional and technical design, implementation, testing and deployment) necessary to initialize and periodically update the Pricing Solution’s Price-MART.
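The characteristics above can be sketched as a row type: one row per invoice line, a leading block of direct/derived attributes, then the WFE columns and derived Price Points. All field names here are illustrative, not taken from any specific package:

```python
# Hypothetical Price-MART row layout: direct/derived attributes first,
# then Waterfall elements, with Price Points computed from WFE's.

from dataclasses import dataclass

@dataclass
class PriceMartRow:
    # Derived Attributes section (alphanumeric or numeric)
    invoice_id: str
    line_no: int
    product_code: str
    customer_code: str
    sales_org: str
    region: str
    # Waterfall elements (one column per WFE)
    base_price: float
    on_invoice_discount: float
    rebate: float
    allocated_freight_cost: float

    # Price Points: derived from (2) WFE's, not components of their own
    @property
    def invoice_price(self) -> float:
        return self.base_price - self.on_invoice_discount

    @property
    def pocket_price(self) -> float:
        return self.invoice_price - self.rebate

row = PriceMartRow("INV-1", 1, "SKU-B1-S6", "CUST-042", "SO-01", "NORTH",
                   10.0, 1.5, 0.5, 0.25)
# row.invoice_price is 8.5 and row.pocket_price is 8.0 for this line
```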
Let us first recall the previously cited definition of “Cost of Goods” type WFE’s:
- Cost of Goods Items: Constitute a very important part of the Waterfall. They relate to elements that encompass how much it costs the organization to produce its products or services (COGS, “Cost of Goods Sold”), plus incorporating profit margins appropriately.
Indeed, profit-margin-related items are WFE’s (such as “Virtual Profit Margin”; see the section titled “Dissecting Waterfall Elements: A CIP/Beverage Industry Application” right after this one for more details and a full understanding, with real-life examples, of all the WFE types on a Price Waterfall) that appear on a typical Waterfall after the actual “cost-related” items.
Yet the most important characteristic to grasp about any Waterfall element that is part of an applied Price-MART is the fact that each line of the Price-MART corresponds to the most granular sales transaction of the client organization. Let us show a practical example from a Beverage Manufacturing company:
- A beverage in a container of size (S) of (6) units each (called an “SKU”) is sold under (5) different brands (B1-B5) to clients (retailers on route [R], monthly). A client on a given route is visited (N) times a month, and once a month a consolidated invoice is submitted to the end client to collect payment. Let us consider such an invoice having a layout as follows:
Using a CIP/Beverage organization application as an example, the following figures illustrate in more detail how the different “sections” or “categories” of WFE’s break down.
Figure (4): Dissecting Waterfall Elements on a CIP/Beverage Industry Application (Part I)
Figure (5): Dissecting Waterfall Elements on a CIP/Beverage Industry Application (Part II)
Figure (6): Dissecting Waterfall Elements on a CIP/Beverage Industry Application (Part III)
Figure (7): Dissecting Waterfall Elements on a CIP/Beverage Industry Application (Part IV)
The Price Setting and Cost Allocation Processes: Feeding the Waterfall With Core Price Elements, Adjustments and Allocated Cost Elements
It was mentioned previously that the automation of the “feeding” routines that run periodically to fill up/update an organization’s Waterfall (Price-MART) constitutes at least 80% of the overall effort involved in an Integration work stream. This rule of thumb becomes valuable when deciding how to allocate resources to your work stream and how to set priorities towards meeting deadlines. The work just described is called the “Inbound Arm” (transactional data coming from the ERP platform into the Pricing Solution) of the Integration work stream, and we have dedicated an entire section to the details of all the nuts and bolts involved in designing and implementing the “Inbound Arm”.
To put things in perspective, one FTE for 1-2 months (being very conservative) could roughly, from the moment the underlying data design is defined and finalized, take care of the functional and technical design corresponding to the “Outbound Arm” (Price Policy data coming from the Pricing Solution onto the ERP platform). But you will be more than in need of delegating work on the “Inbound Arm”, most probably by groups of WFE’s. The amount of work, and the uncertainty involved until all the information needed to understand how to initialize and allocate is discovered, can be much greater.
Tip: Delegate the Outbound work to one person for a limited period of time. Be sure that person will also be able to help you with the Inbound work. Be conservative when estimating the number of people you can count on to complete all the functional/technical specification work related to allocation methods. You may need to run a full-blown shop developing custom code to accomplish this goal, and the analysis piece related to identifying/understanding the data sources and collaborating with the responsible teammates on the ERP side may take significant time. Be conservative in estimating this period of time because, typically, the information you need will not be as readily available as you might want
Feeding the Waterfall with Core Price Elements and Adjustments is part of the Inbound Arm work. You will be designing, functionally and technically, an automated process that feeds back invoice line fields such as Base Price, Rebate amounts, and a number of fields already available on each invoice. No allocation is needed for these fields because, as you may remember, each line on the Price-MART should correspond to each and every sales invoice line. Attribute data such as Customer, Sales Organization, Region, Product, etc. (just to mention a few fields) may need to be derived; such data will be queried from one or more tables in your ERP system database, regardless of how simple or complicated those queries may be.
Tip: Work with your Pricing Solution manufacturer's specialist and, if needed, directly with Product Engineering to secure the necessary adjustments to the product itself to fulfill the client's requirements. That is, if the specialist working with you at the client site warns of potential performance issues preventing the Pricing Solution from holding non-aggregated Price-MART data, you will need to stand firm and pursue the resolution of that problem with Core Product Engineering. Functionality and satisfaction of your client's requirement that Price/Profitability analysis can be done at a transactional (invoice line) level cannot be negotiated away, because value delivery heavily relies on these premises. Keep requirements and technical limitations separately analyzed, and establish the right priorities with whoever helps you on-site from the Pricing Solution manufacturer's side.
Tip: One topic that should not catch you off guard is the determination of the actual Transaction-ID field value to be recognized when loading data into your Price-MART. You will need to be able to uniquely identify transaction records. Be sure you diagram the situation and work together with the counterparts involved in the Sales/Distribution process currently supported by the client's ERP solution. Spend time maturing a solution for this issue, as it will most probably not be an easy one. If, for instance, your work stream is part of a large, brand-new SAP implementation, the client may want to maintain two feeds in parallel: one from the legacy systems into your Pricing Solution and another from the successive "ecosystems" in SAP that may start to be rolled out on a "scattered" basis. If this is the case, not only will the determination and correct generation/population of the Transaction-ID field be a challenge, but the unification of Master Data-related IDs (Customer IDs, Product IDs, Organizational-Unit IDs, etc.) will also require consideration.
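To make the parallel-feed risk concrete, here is a minimal sketch (Python, purely illustrative; the field names and separator are our assumptions, not a client standard) of composing a Price-MART Transaction-ID that remains unique even when a legacy feed and a SAP feed carry the same document and line numbers:

```python
# Hypothetical sketch: composing a globally unique Transaction-ID when two
# feeds (legacy and SAP) run in parallel. Field names are illustrative only.

def make_transaction_id(source_system: str, document_number: str, line_number: int) -> str:
    """Prefix the document/line key with its source system so that IDs
    from the legacy feed can never collide with IDs from the SAP feed."""
    return f"{source_system}:{document_number}:{line_number:04d}"

legacy_id = make_transaction_id("LEGACY", "0090001234", 1)
sap_id = make_transaction_id("SAP", "0090001234", 1)  # same doc/line, different source
```

The same idea applies to Master Data IDs: carrying the source system as part of the key postpones, but does not replace, a proper unification effort.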
Using Industry Print ©: Overall Processes and Tasks To Conceptualize While Minding Your
Outbound Arm
Even though we will present the diagram below one more time when talking about the "Inbound Arm", we will spend some time here on the related terminology, trying to visualize the end-to-end thought process involved in any Pricing/ERP Integration work stream.
Let us first agree on the meaning of both “Arms”:
- The Outbound Arm: As a result of the overall Price Management process, Policy Tables are assembled using your new Price/Profitability solution as a tool. Data in these tables needs to be published so the current ERP-side Sales/Distribution sub-systems remain compliant with it. Such data needs to be periodically exported from the newly implemented Pricing Solution into the corresponding ERP system Master Data tables so new sets of prices can be enforced when issuing new invoices. All the functional/technical specification preparation activities, plus the subsequent implementation and deployment tasks revolving around the automation of this "export" process, constitute what we have defined as the "Outbound Arm (OTB)" of the Integration work stream.
- The Inbound Arm: Transactional data is generated daily on each and every invoice issued to end clients when selling products/services. We are referring to invoice data at its inception, when it is recorded as a receivable. Pure invoice transactional data (item prices, discounts and others directly available on each invoice) will need to be imported into the newly implemented Pricing/Profitability optimization solution as it becomes available. At the same time, incurred cost data for the operating period in question will need to be allocated over each and every sales transaction line. All the functional/technical specification preparation activities, plus the subsequent implementation and deployment tasks revolving around the automation of this "import" process, constitute what we have defined as the "Inbound Arm (INB)" of the Integration work stream.
We have developed a template (IP) file in Industry Print © to serve as a guideline for understanding the processes and tasks involved in an Integration work stream of this nature. It will be provided as an Appendix. Two (2) relevant levels of this template diagram have been presented in Figures (8) and (9).
Figure (8): High Level Industry Print © Integration Work Stream Process Diagram
Figure (9): Outbound Arm Section of an Industry Print © Integration Work Stream Process Diagram (Process VE-010-060 expanded)
Pricing Tables And Identifying Where the Outbound Work Will Be Focused On
Figure (10) below presents an overall perspective of where the work related to the Outbound arm of the work stream will logically reside. The core activity will be to complete a mapping between all relevant fields in the corresponding Price Tables of your Price Optimization solution and the Sales/Distribution/Master Data Management ERP module tables and fields. This is why we have included the next three (3) Sections in this part of this contribution:
- How to go about identifying data containers (And their data definitions) on the ERP side where Price
Policy Data will be stored
- Factors to be considered when choosing a middleware solution to automate the Integration Outbound
Arm
- How to go about configuring both your Pricing/Profitability Optimization solution and your middleware
solution to automate the Integration Outbound Arm
Figure (10) offers a very detailed vision of where the Outbound part of the Integration work stream resides. The way Price Policy Tables are designed, that is, the underlying data model and the attributes and relationships that end up shaping which fields the objects/tables on each side will have, will vary greatly with the industry/business your customer is in.
We can cite as an example the Price Table elements for a CIP/Beverage Industry client application. For a beverage producer and distributor, the following elements are key in shaping their Policy Tables:
- Product Brand (Different brands of beverage that are distributed)
- Size and Type of the container (“Bottle”) in which a given product brand could be delivered
- Agency in charge of distributing the different brands
- The Region and specific city Zone in which both the Agency resides and end clients reside
- Packaging types on which different brands are distributed (Packages of 6, 12, 24 bottles for example)
- Route (Within a Region and a Zone) in which a given end client is located on the distribution process
- Distribution Channel (in this case, a beverage producer differentiates between distributing beverage to small businesses vs. large retailers, such as any of the large US retailers, for example)
- Market Segment (in this case, a beverage producer differentiates between different target markets following the classical definition of "Target Market": high-end/low-end restaurants, bars, etc.)
As a result, for a given period of time (normally a fiscal year or month; there will always be a "Beginning Date" and an "End Date" indicating the period over which a given Policy Table is current), and combining one or more of the attributes listed above as applicable, Price Policy Tables will establish Price Data covering:
- Base Price
- Size/Type Adjustments
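As an illustration only (the field names are our assumptions, not any vendor's schema), one row of such a Policy Table, with its effectivity window and a few of the key attributes listed above, could be sketched as:

```python
# Illustrative sketch of one Price Policy Table row for the beverage example.
# All field names and values are invented for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass
class PricePolicyRow:
    begin_date: date              # period over which the policy is current
    end_date: date
    brand: str                    # Product Brand
    container: str                # Size/Type of container
    agency: str                   # distributing Agency
    region_zone: str              # Region / city Zone
    base_price: float             # core price element
    size_type_adjustment: float   # Size/Type Adjustment

row = PricePolicyRow(date(2009, 1, 1), date(2009, 6, 30),
                     "Brand A", "600ml Glass", "Santa Fe 1", "MEX-Z1",
                     5.00, -0.25)

def is_current(policy: PricePolicyRow, on: date) -> bool:
    """A policy row applies only within its Beginning/End date window."""
    return policy.begin_date <= on <= policy.end_date
```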
On Figure (10), the area where the focus of the Outbound Arm work resides has been signaled. Three (3) main tasks are part of this piece of the work stream:
- Field Mapping with Sales and Distribution ERP module Tables and fields
- Configuring your Pricing/Profitability Solution to export Price Policy Data in a format that can be
read and loaded via automation on the ERP side
- Choosing and Configuring a Middleware component (Out of the box or customized) to automate
loading of Price Policy data onto the ERP Sales and Distribution module and making it available for
invoicing
Identifying Data Sources On An ERP Platform (Mapping Process): Digging into Sales and
Distribution Data
These data sources will vary greatly in terms of their specific data design from client to client. But typically, you will be interested in finding documentary resources and contacting the right teammates (whether from the client, a 3rd-party vendor or from Deloitte) on the ERP side who can answer questions related to that solution's "Sales and Distribution" module.
Essentially, you will be interested in understanding in its entirety the underlying data model residing on the ERP side and supporting "Sales and Distribution" functionality. This is, basically, all entities and relationships, expressed then as a set of tables and fields, supporting primarily the issuance and management of sales invoices.
In the case of SAP, it will be required to determine the set of tables and fields related to the condition records that will hold Price Policy data. Normally such condition records will be composed of "ZV(xx)" fields. Following is an example of a mapping applicable to our CIP/Beverage Industry client example:
SAP Field On Condition Records    Field On Your Pricing/Profitability Optimization Solution
ZV00                              Base Price
ZV01                              Type of Container Adjustment
ZV02                              Price Zone Adjustment
ZV03                              Agency-related Adjustment
ZV05                              Other Adjustments
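A mapping like the one above can be captured as a simple lookup so that an export routine renames Pricing-side fields into their SAP condition-field names. This Python sketch mirrors the example table only; it is not vendor code:

```python
# Lookup mirroring the example mapping table above (illustrative only).
ZV_FIELD_MAP = {
    "Base Price": "ZV00",
    "Type of Container Adjustment": "ZV01",
    "Price Zone Adjustment": "ZV02",
    "Agency-related Adjustment": "ZV03",
    "Other Adjustments": "ZV05",
}

def to_condition_fields(price_record: dict) -> dict:
    """Rename Pricing-solution fields to their SAP condition-field names,
    dropping anything that has no mapping."""
    return {ZV_FIELD_MAP[k]: v for k, v in price_record.items() if k in ZV_FIELD_MAP}

sample = to_condition_fields({"Base Price": 5.00, "Price Zone Adjustment": 0.10, "Notes": "n/a"})
```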
Tip: Write down relational (Entity-Relationship diagram) schemas, as the underlying data related to "Sales and Distribution" functionality will typically be relational (Oracle or SAP ERP platforms, for instance, rest on relational databases). Even if the ERP platform rests on a non-relational database, first understanding how the underlying data model is laid out, modeling reality in terms of entities and relationships, will always help. You can still listen to, understand, or read documentation about how the data model is composed on the ERP side supporting "Sales and Distribution" functionality, and then elaborate your "own" Entity-Relationship Diagram. We suggest doing so as a best practice to ensure all the details of the design are understood. This Tip will also apply to the "Inbound Arm" work, as will be detailed in the following Section of this contribution.
It will not be uncommon to find that, regardless of the specific Pricing/Profitability (package) solution being provided to a client, the technical design and implementation work related to the Outbound Arm will be pretty much out of the box. That is, the data model on the Pricing solution side will need to be customized so exported data can then be loaded on the ERP side, but such customization work tends not to be very extensive in comparison to the Inbound Arm side of things.
In fact, Pricing and Profitability solution vendors such as Vendavo ® (www.vendavo.com) and Zilliant ®
(www.zilliant.com) have offered very precise documentation on how to configure the Pricing product in such a way
that Price Policy data could be interfaced/integrated successfully with ERP platforms such as Oracle or SAP.
In the case of Vendavo ®, pages 50-160 of the attached PDF Guide offer details on how to go about configuring the solution to interface Price Policy data (in the form of condition records) into SAP.
Note: To gain proper access to this material, when engaged on an assignment, please contact James Weaver
(jweaver@deloitte.com)
The essential steps to be followed to set up Vendavo ® and get it ready to interface Price Policy Data follow:
1. Extract pricing meta-data from SAP ERP
2. Load meta-data into Vendavo (that is, schema or data-definition data that came from SAP into Vendavo)
3. Define extensions to VPriceRecord (to make this step applicable regardless of the package being used as the Pricing/Profitability solution: customize the data container you have on the Pricing side so it can hold all the data elements required to interface with the corresponding ERP platform)
4. Map source entities to VPriceRecord (in general, and regardless of the Pricing package being used, the product's data model holds data on entities, and those elements will need to be mapped onto the final data container (for Vendavo, VPriceRecord) from which Price Policy data will be exported to the corresponding ERP platform)
5. Map entity fields to condition table fields (this will be an ERP-package-specific step and may vary in its implementation; however, irrespective of package, there may be a need to map the results held in the data container for Price Policy records onto the values that will be loaded into the corresponding ERP platform (Sales and Distribution/Master) Price Tables)
6. Map pricing fields to condition types and an ordered list of possible condition tables (SAP-specific step)
7. Define adapters for generating VPriceRecord and VConditionRecords (irrespective of package once more: because you have customized the Pricing-side data model to accommodate how data will be needed on the ERP side, you will also need to customize the corresponding routine/job/event that exports the data and makes it available to be interfaced onto the ERP side)
8. Export VConditionRecord (the actual execution of the "Export job", irrespective of package. It may generate a CSV file, an IDoc (NetWeaver ®) file directly, or leave data in a relational table)
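Step (8) can be sketched, irrespective of package, as a small export job that flattens the mapped condition records into a CSV file for the ERP-side loader. This Python sketch is illustrative only; the field names and CSV format are assumptions, not Vendavo's actual API:

```python
# Hedged sketch of an "export job": serialize mapped condition records as CSV
# text (in practice this would be written to a file or staging location the
# middleware watches). Field names are illustrative assumptions.
import csv
import io

def export_condition_records(records: list, fieldnames: list) -> str:
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return buf.getvalue()

csv_text = export_condition_records(
    [{"ZV00": "5.00", "ZV01": "-0.25"}, {"ZV00": "6.00", "ZV01": "0.00"}],
    fieldnames=["ZV00", "ZV01"],
)
```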
Choosing Among Options For Middleware Usage and Configuration To Implement Your
Outbound Arm
Essentially, just because a given manufacturer of a Pricing/Profitability solution such as Vendavo ® or Zilliant ® provides a full-blown set of software components to automate the Outbound Arm part of the Integration work stream with SAP or Oracle, that does not necessarily mean using those components is the best or most efficient option.
It may very well be the case that your client has nobody able to support SAP's XI/PI integration platform (Process Integration ®), so the selection of such a middleware component (unless already acquired for many other purposes) may not be feasible or cost efficient. Or, simply, the frequency of update of Price Policy data in the ERP system may not be high enough to justify the risk of incorporating another component into an already complex set of interfaces implemented over XI/PI ®.
Perhaps just the implementation of a UNIX shell script which uploads Price Policy data directly into Oracle ® tables, or into temporary tables/objects, on a scheduled basis (every six (6) months, for example) will suffice. A thought process like this, simply evaluating the best option in terms of cost/benefit and value, is highly recommended.
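As a hedged sketch of that lightweight alternative (Python here rather than a shell script, with sqlite3 standing in for the real ERP database; the table and column names are invented), a scheduled loader could be as simple as:

```python
# Deliberately minimal alternative to full middleware: load the exported
# Price Policy CSV into a staging table on a schedule (e.g. via cron).
# sqlite3 stands in for the real database; names are illustrative assumptions.
import csv
import io
import sqlite3

def load_policy_csv(conn: sqlite3.Connection, csv_text: str) -> int:
    """Parse the exported CSV and insert its rows into a staging table,
    returning the number of rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS z_price_staging (zv00 REAL, zv01 REAL)")
    rows = [(float(r["ZV00"]), float(r["ZV01"]))
            for r in csv.DictReader(io.StringIO(csv_text))]
    conn.executemany("INSERT INTO z_price_staging VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
n = load_policy_csv(conn, "ZV00,ZV01\n5.00,-0.25\n6.00,0.00\n")
```

Whether a sketch like this is acceptable depends on the update frequency, auditing requirements, and who will maintain it, which is exactly the cost/benefit evaluation recommended above.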
In the case of Vendavo ®, pages 90-150 of the previously attached PDF Guide offer details on how to go about configuring XI/PI ® to interface Price Policy data (in the form of condition records), creating on an automated basis IDoc documents to be loaded in SAP.
Using Industry Print ©: Overall Processes and Tasks To Conceptualize While Minding Your
Inbound Arm
To recap:
- The Inbound Arm: Transactional data is generated daily on each and every invoice issued to end clients. We are referring to invoice data at its inception, when it is recorded as a receivable. Pure invoice transactional data (item prices, discounts and others directly available on each invoice) will need to be imported into the newly implemented Pricing/Profitability optimization solution as it becomes available. At the same time, incurred cost data for the operating period in question will need to be allocated over each and every sales transaction line. All the functional/technical specification preparation activities, plus the subsequent implementation and deployment tasks revolving around the automation of this "import" process, constitute what we have defined as the "Inbound Arm (INB)" of the Integration work stream.
We have developed a template (IP) file in Industry Print © to serve as a guideline for understanding the processes and tasks involved in an Integration work stream of this nature. It will be provided as an Appendix. Two (2) relevant levels of this template diagram have been presented in Figures (11) and (12).
Figure (11): High Level Industry Print © Integration Work Stream Process Diagram
Figure (12): Inbound Arm Section (Master Data Interfacing) of an Industry Print © Integration Work Stream Process Diagram (Process VE-010-010 expanded)
Figure (13): Inbound Arm Section (Transactional Data Interfacing) of an Industry Print © Integration Work Stream Process Diagram (Process VE-010-030 expanded, level (1))
Figure (14): Inbound Arm Section (Transactional Data Interfacing) of an Industry Print © Integration Work Stream Process Diagram (Process VE-010-030 expanded, level (2))
Figure (15): Inbound Arm Section (Transactional Data Interfacing, allocation procedures) of an Industry Print © Integration Work Stream Process Diagram (Process VE-010-030-020-030 expanded, level (3))
Figures (11-15) intend to represent, in a very simple but nonetheless comprehensive way, the sequence of phases/activities involved in the "Inbound Arm" piece of any Pricing/Profitability ERP Integration work stream. As mentioned previously, an .IP file supplying the diagrams shown has been provided as a template in Appendix (2). Indeed, Industry Print ® could also be used to document the essentials of the Functional part of both the Outbound and Inbound Arms of the Integration work stream, in conjunction with other standard deliverables obtained from our EVD ® database (in KX), to start up work on Functional and Technical specifications.
Following the sequence documented in Industry Print ®, the remaining parts of this Section cover the following details:
- In order to complete the tasks and sub-processes VE-010-030-040 "Build Pricing Solution's Price-MART Skeleton/Structure" and VE-010-030-050 "Populate Price-MART Structure in Pricing Solution", some classification of WFE's in terms of their originating business processes and corresponding data sources on the ERP side needs to be completed. Guidelines and Tips are provided in this regard.
- Some Tips/Recommendations are provided around the sub-process VE-010-030-020-030 "Allocate Raw Cost Data at an Invoice Line Level", where a clear distinction between "Allocation" and "Aggregation" is made. A practical example extracted from a CIP/Beverage Industry client application to allocate "Transportation Costs" is detailed step by step (IP tasks shown above: VE 010-030-020-030-010, VE 010-030-020-030-020, VE 010-030-020-030-030 and VE 010-030-020-030-040).
- Finally, the overall automation process around VE-010-030 (the complete "Inbound Arm") is covered. Recommendations and Tips are provided, along with examples illustrating know-how on how to tackle Functional and Technical specification work and implementation in terms of tool usage vs. custom development.
Classifying Your Waterfall Elements To Facilitate Data Sources Identification On the ERP Side
When working on the Inbound Arm, the practitioner leading an Integration work stream of this nature will need to:
- First, understand the current Waterfall design, that is, all the WFE's business meanings in terms of revenues and costs, trying first to understand, in general, what each WFE is attempting to measure. Maturing the related concepts may take a few days; expecting to grasp every concept too quickly would not be realistic.
- Then, group WFE's by underlying Enterprise Resource Planning macro-process. In the context of a large SAP implementation, for example, and depending on the specifics of the client's industry, the practitioner will need to identify first, at a higher level, all the "end-to-end" business processes that are part of the SAP initiative. Indeed, it could make sense to see both more than one WFE tied to one macro-process, and one WFE tied to many macro-processes (that is, involving data to be processed and transformed which is part of several business macro-processes, processes or sub-processes). Here is an example:
- For a CIP/Beverage industry (Manufacturing) client who manufactures and distributes beverages, these are some of the end-to-end processes to be identified:
- Product to Profitability (P2PR) Process (this is typically the macro-process of which your Pricing/Profitability solution implementation effort will be part)
- Product to Client Process (P2C) (the "Distribution" of products/services to the client's door or the distributor's location, depending on the distribution channel being utilized)
- Product to Payment Process (P2P) (The “Invoicing” process of goods sold)
Tip: Be aware that you may very well also be contributing to the work streams implementing the new systems you are trying to integrate with. For example, as a result of digging into the ERP processes and tying them to your WFE's, you may help Analysts implementing a new ERP platform, or maintaining an existing one, to anticipate/discover design pitfalls, weaknesses, or simply valuable opportunities for improvement. Bringing this type of perspective when trying to communicate with others to gather the information, and most importantly the knowledge, you are looking for may be beneficial. Anticipating problems will always be better than having to react to them once their consequences can no longer be ignored.
Going back once again to our CIP/Beverage Industry reference example, a sample classification of WFE's prior to mapping them to "end-to-end" business macro-processes is presented below:
• Sales and Distribution Data: All the following data elements will appear on final sales invoices:
• Price-related fields (core Price elements plus/minus adjustments, governed by Policy Tables; see page (21) of this Paper for a sample list of possible fields)
• Discounts (governed by Policy Tables; example on page (21))
• Adjustments (governed by Policy Tables; example on page (21))
Note: Populating the WFE's categorized under this classification corresponds to the execution of IP tasks VE-010-030-040 "Build Pricing Solution's Price-MART Skeleton/Structure" and VE-010-030-050 "Populate Price-MART Structure in Pricing Solution".
Here we now present one example of a mapping exercise between WFE's and ERP macro (end-to-end) processes:
Tip: Make sure these two (2) concepts are clearly differentiated and understood by all counterparts you have a chance to interact with from the very beginning of your engagement. To allocate is to "break apart" and distribute, to go from big to small, not to gather or reunite. To aggregate is to "add up".
In Pricing/Profitability, it makes sense to "aggregate" once you have "allocated", but not before. If a set of data is already available (line by line) on a sales invoice, there is no need to "allocate" that data. You typically allocate costs because costs are accumulated as totals in your ERP Financial module (FICO © in SAP, Financials in Oracle), while cost bookkeeping occurs in parallel with sales realization. To allocate is to "spread out" a large amount of cost among every single transaction line realized.
You can, though, aggregate already-allocated data on historical Price Analytics, while keeping reasonably recent Profit Analysis data "allocated" at the lowest/most granular level, which is the invoice line. You would then perhaps be content looking at the more distant past at a greater level of aggregation. You cannot, nonetheless, do any value-adding Pricing/Profitability analysis by allocating over input data that has already been aggregated. You can aggregate for reporting purposes, but you need to be able to drill down to the transaction while doing Pricing/Profitability analysis, or you will not be able to provide your client with the possibility of detecting the root causes of relevant behaviors or opportunities for increased profitability.
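The allocate-then-aggregate ordering argued above can be shown in a few lines (Python, with invented figures and "units sold" as a stand-in cost driver):

```python
# Toy illustration of the order of operations: allocate a cost total down to
# invoice lines first, and only then aggregate for reporting. All figures and
# the driver (units sold) are invented for the sketch.

invoice_lines = [
    {"invoice": "A-1", "customer": "C1", "units": 100},
    {"invoice": "A-2", "customer": "C1", "units": 300},
    {"invoice": "B-1", "customer": "C2", "units": 600},
]
total_cost = 500.0
total_units = sum(line["units"] for line in invoice_lines)

# Step 1: allocate to the most granular level (the invoice line).
for line in invoice_lines:
    line["alloc_cost"] = total_cost * line["units"] / total_units

# Step 2: aggregate the already-allocated values for a report.
by_customer = {}
for line in invoice_lines:
    by_customer[line["customer"]] = by_customer.get(line["customer"], 0.0) + line["alloc_cost"]
```

Running the two steps in the opposite order (aggregating first) would make the invoice-line drill-down impossible, which is exactly the analysis capability the text warns not to give away.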
We present below an example of how to allocate costs in MS Excel ®. The case in question, once again, revolves around a beverage manufacturing company which distributes its star products (beverage brands) to end clients (restaurants, bars, mom-and-pop shops, grocery stores, billiards halls, etc.) from Distribution Agencies, visiting these clients with trucks that depart along different routes from the different Distribution Agencies.
The problem: For one (1) week of sales transactions taking place the first week of July, and based on the accumulated transportation costs (gas for trucks, truck repair and maintenance, truck leases, etc.) incurred by a group of distributing Agencies in a city during the complete month of June, allocate those costs to each and every line of sales realized during that week.
Recognizing the sales and cost drivers that make the most sense: Once on the engagement, you will become familiarized with how the client's products/services distribution happens and with the decision variables involved. In this case:
- The different routes that are covered from a given Agency
- The average number of times a given client is visited within a month of operations
- How much of each brand of beverage has been sold (invoiced) to each visited client in the period in question. Because truck wear and tear may vary with the weight of the products transported, the number of liters of beverage sold, rather than the cash amount of products sold, becomes of interest for this allocation exercise. If you are provided with sales totals in cash (US$), then you need to be able to find an equivalence between products and liters to bring up figures in liters as needed
- How many times within the week in question a visit is considered successful (i.e., leads to the issuance of an invoice)
- How much freight cost has been recorded in the past month for operations conducted from each Agency
Route-ID Description
1 Mex1-SF1 Route
2 Mex1-SF2 Route
3 Mex3-SF3 Route
Strategic Business Unit (SBU)   Agency Name   Month/Year   Cost-Element    Total AMT Spent per Month
Mexico City 1                   Santa Fe 1    Mar-08       Gas/Truck       $10,000.00
Mexico City 1                   Santa Fe 2    Mar-08       Gas/Truck       $25,000.00
Mexico City 3                   Santa Fe 3    Mar-08       Gas/Truck       $10,000.00
Mexico City 1                   Santa Fe 2    Mar-09       Rep/Mainten.    $23,000.00
Mexico City 1                   Santa Fe 1    Mar-09       Rep/Mainten.    $10,000.00
Mexico City 3                   Santa Fe 3    Mar-09       Rep/Mainten.    $18,000.00
- Sales Data detail:
Customer ID#/Route   Sold-On-Date   Product ID#    FINAL Price/SKU   #SKU's Sold   NET Amount SOLD   L/SKU Relationship   NET LIT/SOLD
0001-1               3/1/2008       Beverage (1)   $5.00             2500          $12,500.00        2                    5000
0001-1               3/1/2008       Beverage (2)   $6.00             6000          $36,000.00        2.5                  15000
0001-1               3/1/2008       Beverage (1)   $5.00             10000         $50,000.00        3                    30000
- Pivoting activity: To "pivot" is to summarize and rationalize the base data you have obtained, coming up with a series of "coefficients" or factors that you will finally apply (multiplying or dividing, as applicable) to each and every sales transaction line to arrive at the final (allocated) values of the WFE you are trying to allocate.
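The pivoting step can be sketched as summarizing sales lines into per-route totals that later serve as allocation denominators (Python; the volumes are invented, and the route IDs merely echo the tables above):

```python
# Sketch of the "pivoting" activity: summarize base sales lines into
# per-route liter totals, the coefficients the allocation step divides by.
# All volumes are invented for illustration.

sales_lines = [
    {"route": 1, "liters": 5000},
    {"route": 1, "liters": 15000},
    {"route": 2, "liters": 30000},
]

def liters_by_route(lines):
    """Pivot: total liters sold per route."""
    totals = {}
    for line in lines:
        totals[line["route"]] = totals.get(line["route"], 0) + line["liters"]
    return totals

coefficients = liters_by_route(sales_lines)
```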
- Overall Allocation Method: Four (4) basic steps, outlined below, are to be applied when allocating "Transportation Costs" in this particular exercise.
Tip: "Allocating" can also be much simpler than this example suggests. For another WFE you may, for example, be given total costs by customer, and you would only need to summarize sales per customer over a given period of time and apply a simple "rule of three" (ratio), finding what portion of that particular customer's cost corresponds to each invoice line. That leads you to the final calculation of the allocated WFE value. As long as the underlying data model and the scope of the data provided are understood, you will progressively figure out yourself what allocation method makes the most sense. You will even be in a position to suggest to your (possibly "Strategy and Operations") teammates what specific allocation logic would best fit a given WFE. You will be able to come up with allocation methods yourself and submit/discuss them for approval with the business experts from the client or on the Deloitte side of the engagement.
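The simple "rule of three" case described in this Tip can be sketched as follows (Python; all figures are invented):

```python
# Simple ratio ("rule of three") allocation: an invoice line receives the
# share of its customer's total cost that matches its share of that
# customer's sales. All figures below are illustrative.

def allocate_by_ratio(total_cost: float, line_amount: float, customer_total: float) -> float:
    """cost_for_line / total_cost == line_amount / customer_total"""
    return total_cost * line_amount / customer_total

# A customer with $1,000 of total cost for the period, $50,000 of total
# sales, and one invoice line worth $12,500:
line_cost = allocate_by_ratio(1000.0, 12500.0, 50000.0)
```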
- Final Allocation: Multiply the volume-of-sales allocation factor per liter of beverage sold, weighted by route effectiveness, times the net volume of liters sold on each invoice line.
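Putting the pieces together, here is a hedged sketch of the final allocation (Python; the costs, volumes and the neutral effectiveness weights are invented for illustration, and the route IDs merely echo the earlier tables):

```python
# End-to-end sketch of the transportation-cost allocation described above:
# (1) accumulated cost per route, (2) total liters per route (the pivot),
# (3) a cost-per-liter factor, optionally weighted by route effectiveness
# (successful-visit rate), (4) multiply by each line's liters.
# All data is invented for illustration.

route_costs = {1: 20000.0, 2: 48000.0}    # accumulated freight cost per route
route_liters = {1: 20000.0, 2: 30000.0}   # pivoted liters sold per route
route_effectiveness = {1: 1.0, 2: 1.0}    # successful-visit weighting (neutral here)

factor_per_liter = {
    r: route_effectiveness[r] * route_costs[r] / route_liters[r]
    for r in route_costs
}

def allocate_line(route: int, liters_sold: float) -> float:
    """Allocated transportation cost for one invoice line."""
    return factor_per_liter[route] * liters_sold

cost_line_1 = allocate_line(1, 5000)  # invoice line that sold 5,000 L on route 1
```

By construction, summing the allocated amounts over all lines of a route reproduces that route's accumulated cost, which is a useful reconciliation check to build into any allocation spec.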
Once all the details determining how (which ERP end-to-end process/module) and from where to extract the relevant data needed to allocate each and every cost-related WFE are settled, you will be in a position to execute (delegating/supervising if necessary) the process of writing up Functional/Technical specs around automating cost allocation. Moreover, you will be writing specs around the entire process of periodically populating your Pricing/Profitability solution's Price-MART, ensuring the Inbound Arm of the Integration work stream materializes and comes to life.
Nonetheless, and as expected one more time, not everything will be in your hands. You will find that a number of possibilities may present themselves as far as how to specifically (and technically) implement the entire Inbound Arm automation process. In general, you will need to engage in conversations with your ERP-side expert counterparts and will need to learn, as much as possible and in the heat of the day-to-day work, the details of the diverse "tools" or "mechanisms" that may be put in place to automate the pre-population, allocation, final build-up and extraction sub-tasks feeding your Pricing solution's Price-MART.
Typically, options will range from using tools as they come "out of the box" (OOTBX) on the ERP side, plus your selected middleware tool(s) in your Integration environment, to a combination of OOTBX tools and some custom code, to fully writing custom code to automate the entire process. Here is an example of a "configuration" we came up with at our last client while integrating Vendavo ® with SAP ®:
- Mapping “source” SAP modules or Tools with WFE’s:
WFE Group/Other Data Elements                                    SAP Modules/Tools
Price Elements On Invoice (Base Price, discounts, adjustments)   Sales and Distribution (SD) in ECC, MDM
Fixed/Variable COGS                                              COPA © module (*1)
Transportation Costs                                             Sales and Distribution (SD) in ECC, SAP FI, SAP CO
Incentives Costs                                                 Sales and Distribution (SD) in ECC
Merchandising Costs                                              SAP Inventory Management in ECC, SAP FI, SAP CO
Equipment Costs                                                  SAP Inventory Management in ECC, SAP FI, SAP CO
Payables-related Costs                                           SAP FI, Sales and Distribution (SD) in ECC
Virtual Margin                                                   COPA © module
Supporting Master Data                                           MDM
We ended up doing the following:
• Writing custom ABAP code to extract directly-available on-invoice WFE data and pre-populate an "imaged" Price-MART (Z) temporary table in SAP with all line-based transactional data for a given period. We did this because off-shore ABAP programming resources were more readily available, and some past project history forced us not to consider extracting direct data elements using BI ® (SAP Business Intelligence) extractors. The client also agreed with going the ABAP route because it felt more secure maintaining ABAP code in the future than finding BI ® experts to help at the phase of the project we were in at the moment.
• Extracting the Rebates-related, COGS, Marketing costs, Sales/Commissions and Virtual Margin WFE’s from
COPA, but expressed in aggregated values by Customer/Product as applicable. We were not able to obtain
directly from COPA costs allocated down to the single (outermost) invoice-line level our Pricing/Profitability
solution required. We therefore wrote custom ABAP code to perform the final level of allocation once the
first-step allocated costs were received from COPA into SAP temporary tables.
• Writing custom ABAP code to allocate the rest of the WFE’s. We modularized the work by grouping
together ABAP Functional (for us, Technical; see the next Section for more details on this topic) specification
documents, each leading to an ABAP program that would allocate costs to populate WFE’s needing input data
from common data sources/SAP modules. For example, since “Merchandising Costs” and
“Promotional Items” both took asset data from the “Assets/Inventory” SAP module tables, we
created one Functional specification for an ABAP program allocating costs for both WFE’s. We did the same
for the “Bad Debt” and “Payment Term” cost WFE’s, as both needed input data available from the
“Collections” functionality in the FI module. We also judged modularization by complexity: our
Transportation Costs allocation process, for example, deserved one ABAP program to itself. Finally, we
considered ease of maintenance and readability of code as fundamental factors in deciding how to
modularize.
• Our main ABAP module, that is, our “MAIN” program, would periodically assemble and extract a
Comma-Separated Values (CSV) file, as required by our Pricing Solution, to populate its Price-MART,
thereby allowing Pricing and Profitability analysis to be done by our customer
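The final-step allocation just described (spreading COPA’s aggregated Customer/Product costs down to individual invoice lines) was implemented in ABAP against SAP temporary tables; the proportional-allocation idea behind it can be sketched roughly as follows, in Python and with hypothetical field names, purely for illustration:

```python
from collections import defaultdict

def allocate_to_lines(invoice_lines, aggregated_costs):
    """Spread costs aggregated by (customer, product), as received from
    COPA, down to individual invoice lines in proportion to each line's
    revenue share within its group."""
    # Total revenue per (customer, product) group.
    totals = defaultdict(float)
    for line in invoice_lines:
        totals[(line["customer"], line["product"])] += line["revenue"]
    # Each line gets its revenue-proportional share of the group's cost.
    for line in invoice_lines:
        key = (line["customer"], line["product"])
        group_total = totals[key]
        share = line["revenue"] / group_total if group_total else 0.0
        line["allocated_cost"] = aggregated_costs.get(key, 0.0) * share
    return invoice_lines

lines = [
    {"invoice": "A1", "customer": "C1", "product": "P1", "revenue": 60.0},
    {"invoice": "A2", "customer": "C1", "product": "P1", "revenue": 40.0},
]
allocate_to_lines(lines, {("C1", "P1"): 10.0})
# lines[0]["allocated_cost"] is 6.0 and lines[1]["allocated_cost"] is 4.0
```

Because each line receives a share proportional to its revenue, the allocated amounts always add back up to the COPA group totals, which is also a useful reconciliation check during testing.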
(*1): COPA stands for the SAP “Controlling Profitability Analysis” module. The very first thing to do whenever
hearing this term in the context of a project is to double check whether the client organization has purchased
license(s) to use this module or not; if not, it may make no sense to even investigate what COPA could do for you
“Out of the Box”. This rule of thumb applies regardless of the specific packaged ERP platform in question. COPA
is a tool that performs allocation, but its performance, including a potential impact propagated to the performance
of ECC as a whole, may suffer if you try to use it within SAP to allocate most of the revenue (invoice)-related
and cost-related WFE’s at the invoice-line level as required.
Another variable to take into consideration is the moment at which you arrive to the game overall. If you start on
the engagement from the beginning and have the opportunity to be in the loop on at least the key ERP
module-specific design work sessions, you can ensure that design considerations on the bigger (ERP) side of things
take integration with a Pricing/Profitability solution into account. This will greatly save you time when you arrive
at the mapping challenge phase described previously. If that is not the case, which is what we recommend assuming
in order to prepare for what could be most challenging, you will be negotiating adjustments to be made on the
ERP side when/if needed, or, at the very least, validating and co-validating designs on both ends in parallel.
All of these factors will influence final decisions in terms of how to implement, when to use OOTBX tools/features,
and when/in which language/with which team/resources to develop custom code.
Tip: “Technically feasible” and “Practically feasible” are two very different things. You may have found
that through the use of a great packaged tool (like COPA, or, in the case of SAP, another tool within BI ® that
even helps you do allocations) you would not be re-inventing the wheel. However, if no resources have been
budgeted as part of the ERP-side project planning (whether on a brand new implementation of modules or simply
as part of the current support team) to do the necessary configuration work for you, you may end up needing to
just write custom code to perform allocations: custom code in a language like ABAP © in the case of SAP,
PL/SQL in the case of Oracle, or even C, C++, XML or any other 3rd or 4th generation core programming
language, all depending on what would be available and, most importantly, what would be feasible in
time. You may very well encounter a situation where, because you are proposing work to integrate your Pricing
Solution with the ERP platform once some of the newly implemented ERP modules’ configuration work has
already been completed, you will find some apprehension/fear to “touch” again what has already been done and
have to regress-test it.
Modularizing Your Inbound Automated Solution and Minding/Developing Your Functional And
Technical Specifications
Just for illustration purposes, Figure (16) above shows in a graphical manner a practical example of how to go about
a code design/architecture modularizing the different software components which will automate your Inbound
Arm. It also shows how this graph was used to report progress/status of the overall implementation process,
highlighting where challenges/dependencies were one way or the other impacting work in progress.
The reasoning behind how and why modularization took place on this particular application of all these techniques
was provided in the prior Section of this contribution. Essentially, this is the modularization sequence:
• On the lower part of the graph a representation of the Pricing solution Price-MART is provided. (12)
ABAP modules were developed, all working in synchronization/sequence to end up populating the Price-
MART with data formatted exactly as needed to be loaded onto your Pricing/Profitability solution
• The first (4) modules (top left area of the graph above) took care of extracting Master Data components
from the SAP MDM module (Product, Customer, Sales Organization and Miscellaneous (seed) Master Data)
• One (1) MAIN ABAP module initialized a Z (temporary) table in SAP mirroring our Pricing Solution
Price-MART, with one record per invoice transaction line generated over the sales period in question. All
records would then be ready so their corresponding WFE columns could be populated sequentially as the rest
of the ABAP modules were called from our MAIN ABAP module
• Other (subsequent) (6) modules would allocate the rest of the NON-COPA related WFE’s. All allocations
occurred at the outermost lower/granular (invoice line) level, as stressed many times before
• One final ABAP module would do (2) things: assemble the completed Z-table records into the CSV file
format required by the Pricing solution, and extract that file so it could be loaded onto the Price-MART
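That final assemble-and-extract step can be sketched as follows, again in Python purely for illustration (the actual module was ABAP, and the column names below are hypothetical):

```python
import csv
import io

def export_price_mart(z_table_records, out_stream):
    """Assemble fully allocated Z-table records into the CSV layout the
    Pricing solution's Price-MART load expects."""
    # Column order must match the Price-MART load specification exactly.
    fieldnames = ["invoice", "line", "customer", "product",
                  "base_price", "discounts", "allocated_cost"]
    writer = csv.DictWriter(out_stream, fieldnames=fieldnames,
                            extrasaction="ignore")
    writer.writeheader()
    for rec in z_table_records:
        writer.writerow(rec)

buf = io.StringIO()
export_price_mart([{"invoice": "A1", "line": 1, "customer": "C1",
                    "product": "P1", "base_price": 10.0, "discounts": 1.0,
                    "allocated_cost": 0.6}], buf)
csv_text = buf.getvalue()
```

The key design point carries over regardless of language: the extract format is dictated entirely by the Pricing solution’s Price-MART load specification, so the column list belongs in one place and every upstream module simply fills its own WFE columns.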
Tip: Overcommunicate with your Project Management and Leadership, clarifying as many times as needed
what is “Functional” and what is “Technical” throughout the entire methodological approach applied. Do
this at every step of the way. You will save a great deal of time and psychological energy all along, and free up
time and stamina to move on to getting the interfaces tested and rolled out to your client’s satisfaction
Your Impressive Yet Understandable Diagram To Show the Results of Your Work
Once you have reached this stage, that is, you are no longer figuring out how to go about the challenge of the
Integration job but actually implementing it, it will be time to:
• Help your client draw a nice diagram of how things will look, all tightly integrated
• Most importantly: let the client take credit for the job, and appreciate their time in facilitating your
daily work to arrive at this point
• Prepare the corresponding client counterpart so she/he can explain the diagram thoroughly to the
Project Sponsors and Leaders
An example diagram is shown below in Figure (17). The actual final designed architecture will vary greatly
depending on your client’s industry, current or new (proposed) ERP platform, technological platform, and
Information Technology resources, among many other factors. But the overall “Inbound”/”Outbound” way of
conceptualizing processes will be the underlying know-how strength that carries over from project to project and
client to client to be successful at a job of this nature.
Figure (17): An Integration Work Architectural Diagram Example
To think holistically is the very essence of integrating anything. This is a skill that can only be developed with
time, but with clarity on knowing when to get inside the details of a task/component, when to step out and see the
forest as well, and, in some cases, when to step aside and let others be the experts in areas you are not responsible
for but to whose successful design/implementation/roll-out you still need to contribute as it becomes possible. This
is because your job is “the whole” working, not only the “individual” parts working; however, unless the
individual parts are working, you have not integrated anything either.
Tip: Worry about the details when there is no alternative other than doing so. Otherwise, apply a black-
box type of thinking approach/mindset when going about integrating software components. That is, think in terms
of the inputs needed and the outputs to be produced by a given task, process or sub-process, regardless of
how those inputs are provided or those outputs produced inside such a black box. If you realize you need to get
inside the black box and turn on the light, consider that someone else, for sure, will be in a better position to
provide you with those details, and devote time and effort to figuring out who that person would be within your
bigger team. Spend time figuring out black-box details yourself only if you are left with no other option.
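One way to make the black-box mindset concrete is to write down each component’s input/output contract before worrying about its internals. A minimal Python sketch of this idea follows; the interface and component names are hypothetical, not part of any actual Pricing or ERP product:

```python
from typing import Protocol

class WfeAllocator(Protocol):
    """Black-box contract for a WFE allocation component: it receives
    invoice lines plus a cost pool and returns the lines with a cost
    attached. How allocation happens inside is the owner's concern."""
    def allocate(self, lines: list, pool: float) -> list: ...

def run_pipeline(steps, lines):
    # Each (allocator, pool) step is treated strictly as a black box:
    # only its inputs and outputs matter at the integration level.
    for allocator, pool in steps:
        lines = allocator.allocate(lines, pool)
    return lines

class EvenAllocator:
    """A dummy component satisfying the contract: spreads the pool evenly."""
    def __init__(self, field):
        self.field = field
    def allocate(self, lines, pool):
        share = pool / len(lines)
        return [dict(line, **{self.field: share}) for line in lines]

result = run_pipeline([(EvenAllocator("transport_cost"), 9.0)],
                      [{"invoice": "A1"}, {"invoice": "A2"}, {"invoice": "A3"}])
```

The integration layer never looks inside an allocator; it only verifies that each step consumed and produced what the contract promised, which is exactly the black-box discipline the Tip describes.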
Figure (18)-(19): Overall Typical Testing Strategy Timeline and Milestones Template/Phases Definition/Approach
Testing Approach
Testing activities for the Pricing-ERP Outbound and Inbound Arms were divided into test cycles (durations shown
are estimates). The first cycle covered the Outbound and Inbound Interface software:
• Duration: <TBD> (4 weeks)
• Objective: Assumes unitary testing test case scenario scripts have been validated with the corresponding
business counterparts. Includes validating exported Price Policy data ready to be loaded on the ERP side,
testing via stub calls and individual dummy data generation, and exercising the Inbound Arm software
components (loading and initialization, allocation). This means both Price Policy table data loaded on the
ERP side and initialized/allocated WFE data loaded and properly provided to stakeholders on the Pricing
solution side.
Copyright © 2009 Deloitte Development LLC. All rights reserved.
A KX Contribution: Integrating a Pricing/Profitability Solution with an ERP Platform – December 2009
Tip: Allow prudent time (roughly 2-3 weeks; this is just an estimate, so estimate individually, as a function of
complexity, how many days one person FTE would take to write one test case scenario, and plan accordingly
based on those assumptions) to write unit and integration test case scenario scripts on both Arms of the Integration
workstream. This process may be more challenging than it appears at first glance. As you will not be an expert
on how data processing occurs on the ERP side (a black box for you), you may need to seek support and help from
some of your ERP-side counterparts in order to write test case scenario scripts (from both unitary and integration
perspectives). It will probably help to locate test case scenario scripts developed for each corresponding data
processing/module on the ERP side, read them, and draw ideas on how to write your own. For instance, while
writing test case scenario scripts on Transportation Costs allocation processing, it may help to review test case
scenarios written on data processing related to determining the # of visits per route, data processing on
distribution of core products to Agencies, etc. The same tip applies to testing of the Outbound Arm as well.
- On The Outbound Arm: This process will entail the full end-to-end cycle of Outbound Arm data from your
Pricing Solution onto the ERP-side tables. You will follow these standard steps:
• Run event(s) that generate CSV/TXT Price Policy exported data
• Run events on your middleware platform to ensure Price Policy data is properly and successfully
loaded on the ERP side Price Master Data tables
• Run test case scenario scripts on the ERP side generating invoices and ensuring Price Policy data
is displayed, processed and enforced properly on invoices
- On The Inbound Arm: As at this point you will have already unit-tested all the Inbound Arm components, you
will now be running end-to-end test case scenario scripts which start with the preparation and execution of WFE
filling, all the way through loading and visualizing the transactional data used for Price/Profitability analysis on
the Pricing Solution end.
It is important to highlight that you will need to incrementally regress-test test case scenarios as well. That is, you
may have focused on Transportation or Bad Debt cost allocations in one case; however, you need to re-certify that
other allocation procedures will work properly when run consecutively. Also give importance to making “border”
cases part of your test case scenario script preparation work. For example, while testing allocation algorithms on a
given WFE, ensure transactional data for days (1) and (7) of a week, specifically, has been part of the allocation
execution.
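Border-case and regression checks like these can be expressed as simple automated assertions. The Python helpers below are hypothetical sketches of the two checks just described, not part of any Pricing or ERP product:

```python
def covers_week_borders(allocated_lines):
    """Border-case check: days (1) and (7) of the week must both have
    been part of the allocation execution."""
    days = {line["weekday"] for line in allocated_lines}
    return 1 in days and 7 in days

def allocations_reconcile(allocated_lines, pool_total, tol=1e-6):
    """Regression check: allocated line costs must add back up to the
    original cost pool."""
    allocated = sum(line["allocated_cost"] for line in allocated_lines)
    return abs(allocated - pool_total) < tol

sample = [{"weekday": 1, "allocated_cost": 2.5},
          {"weekday": 4, "allocated_cost": 1.0},
          {"weekday": 7, "allocated_cost": 0.5}]
```

Running such checks after every consecutive allocation run makes the incremental regression testing described above cheap to repeat.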
See Appendix (3) for a sample cut-over document applied to the CIP/Beverage Industry application example used
throughout this contribution paper as an illustration.
Once cut-over activities have been completed and your Integrated Pricing solution is live, your focus will be, for
at least (30) days, to support the solution weekly and ensure your first month of transactional data initializes,
allocates and loads properly. As you will be allocating costs based on cumulative closed costs of the prior month,
your first week may or may not offer immediate clues certifying proper execution of allocations. If a
brand new ERP solution is also being deployed, you will most probably be using legacy cost data to allocate. If
your client will only have newly generated transactional data over the new ERP system to allocate, you may need
to stay around monitoring the weekly load process longer.
In our most recent experience, as the roll-out of a new (SAP) ERP system was to be completed in staggered
phases, our Pricing Solution also interfaced with legacy (ERP) data, so Price Policy data was designed to be
interfaced on a staggered basis as well. Only Price Policy data corresponding to the Regions being rolled out from
the ERP perspective was to be interfaced incrementally, until all Regions were rolled out. This need entailed
increased complexity in the implementation of both Arms, as Master Data unification work was also needed until
the new SAP solution was rolled out in its entirety. Two (2) allocation-related sets of code were implemented, one
allocating legacy data (over different allocation methods depending on the data sources available on the legacy
systems) and the other allocating data in SAP. This part of the overall client reality will obviously vary greatly
from project to project.
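The dual code-path arrangement just described (one allocation routine for legacy data, one for SAP data, selected by Region roll-out status) amounts to a simple dispatch. The sketch below illustrates the idea in Python with hypothetical function and field names; the actual implementations were ABAP and legacy-system code:

```python
def allocate_legacy(lines, pool):
    # Legacy sources lacked per-line drivers, so spread the pool evenly.
    share = pool / len(lines)
    return [dict(line, allocated_cost=share) for line in lines]

def allocate_sap(lines, pool):
    # SAP lines carry revenue, so allocate proportionally to revenue.
    total = sum(line["revenue"] for line in lines)
    return [dict(line, allocated_cost=pool * line["revenue"] / total)
            for line in lines]

def allocate_for_region(region, lines, pool, rolled_out_regions):
    """Dispatch to the SAP or legacy allocation path depending on whether
    the Region has been rolled out onto the new ERP yet."""
    path = allocate_sap if region in rolled_out_regions else allocate_legacy
    return path(lines, pool)

sap_lines = allocate_for_region("North", [{"revenue": 30.0}, {"revenue": 70.0}],
                                10.0, rolled_out_regions={"North"})
legacy_lines = allocate_for_region("South", [{"revenue": 30.0}, {"revenue": 70.0}],
                                   10.0, rolled_out_regions={"North"})
```

Keeping both paths behind one dispatch point makes it easy to retire the legacy path Region by Region as the ERP roll-out completes.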
Section 4.4: Tips, Recommendations And Lessons Learned (Do’s and Don’ts)
Prior experience executing on Pricing ERP Integration workstream activities allows us to list the following
Do’s and Don’ts, summarizing the Tips developed in detail in prior sections:
• Do cultivate relationships with ERP-side counterparts so the process of gathering existing information and
discovering data sources becomes an effective and productive one
• Do rely on existing knowledge and existing work. Spend time figuring out how to leverage existing work
once discovered, learned and understood
• Do apply the “black box” approach when digging onto details to be investigated/understood as part of
daily activities (See Tips elaborated previously)
• Do analyze carefully recommendations coming from on-site Pricing and ERP-side manufacturer expert
advice. In general, manufacturers’ counterparts will not dig for you into knowledge on a side that is not
their domain expertise (for example, a Vendavo ® expert will not help you much in answering questions
coming from the SAP side, or vice versa)
• Do allow enough time/provide conservative estimates on test case scenario design and documentation work
• Do relentlessly prepare material to be shown to any counterparts you expect to help you with your
information discovery activities
• Do anticipate viscosity in the process of gathering information. Do not assume documentation will be
readily available and that the process of gathering will not have significant hurdles and difficulties
• Do not take personally comments that may try to push onto your side responsibility for incomplete
design/implementation work on components on the ERP side
• Do not attempt to embrace all contents and knowledge quickly. Rely on an incremental understanding of
all variables and knowledge acquisition action-items throughout the entire journey
• Do not assume help will be available in terms of relying on ERP counterparts to spend time in helping you.
Be ready to escalate requests to Management counterparts and handle with much care the “political” factors
(Personal agendas and right tone while communicating) related to all counterparts involved
Communicating often will be key, especially with your immediate Project Lead (Manager/Senior Manager). The
level of uncertainty characterizing this type of work will be higher because you will be depending not only on the
work you are able to manage and supervise but also on the work others will be doing. Hence, communicating
often, in the right tone and with the right counterparts depending on the issue/topic at hand, will be the crucial
factor of success when engaged in a challenge like integrating applications.
You will likely be assigned to a Pricing Solution ERP Integration workstream not from the beginning of the
corresponding project but a bit later, with the original deadlines still set as at the project’s initial planning
phase. If that is the case, please read this contribution thoroughly so you can be aware of, and one step ahead of,
the various complications and challenges that will appear. Consider normal the following situations, among many
others, if you “land” and get assigned this type of work:
• To be pressured to deliver results very quickly and to be expected to answer many questions that perhaps
should have been answered several weeks/months ago
• To answer questions involving details of the current ERP side module-specific design considerations you
will not be fully aware of
• To be told, perhaps frequently, “time and resources to collaborate with you will need to be made feasible
because those were not totally contemplated from the very beginning”
• To have to persist in the search for answers, whatever those answers are. You will be politically “turned
down” frequently, and giving up will not be an option
• To have to establish relationships with counterparts at the Management layer on the ERP side (Senior
Managers or even Partners), so you can escalate to them, politically and with care, requests you may have
for them to collaborate with you when deadlines are right on top of you
• To be praised though when you help in “discovering” design “flaws” on the ERP side
• To have to comply with change-management processes on the ERP side when modifications on that side
will be needed before parts of the Integration workstream can be fully specced out. You will need to comply
with many documentation standards that may appear to slow your work, but fighting them does you no
good
If you inherit this type of responsibility in circumstances like these, the first month of work will be a very tough
one, especially if you do not have some sort of passed-through experience document such as what we have
attempted to accomplish with a contribution like this. You will need a lot of serenity and patience at the beginning,
but you will need to press on without pause, day in and day out. In addition, you will most probably have to
gain rapport with your Management within Deloitte first, especially if your Management (the Manager or Senior
Manager leading the initiative) just met you and you were assigned to the project based on your prior experience
and skills. The first assignment will be a difficult one, but once you pass the first month and show results (do focus
on providing results as quickly as possible), knowledge will fall into place and you will know better what to expect
and how to react appropriately. Future assignments will then have a much shorter learning curve and will hopefully
be even more successful and productive.