
Guide for Integrating Systems Engineering into DoD Acquisition Contracts

Version 1.0

December 11, 2006
Department of Defense

CONTENTS

PREFACE ................................................................................................ iii


1 ACQUISITION PROCESS ...................................................................... 1
1.1 Contracting Process ............................................................................. 2
1.2 Important Contract Considerations Affecting Systems Engineering ......... 4
2 SYSTEMS ENGINEERING IN ACQUISITION PLANNING ..................... 6
2.1 Technical Approach and the Systems Engineering Plan (SEP) ................. 7
2.2 System Requirements .......................................................................... 8
2.3 Systems Engineering in the Statement of Objectives (SOO) .................... 9
2.4 Technical Incentive Strategies ............................................................. 11
2.5 Government and Industry Interaction ................................................... 11
2.5.1 Market Research ....................................................................... 12
2.5.2 Industry Days ........................................................................... 12
2.5.3 Draft Request for Proposal (RFP) ............................................... 13
2.6 Technical Planning in the Source Selection Plan (SSP) .......................... 14

3 REQUEST FOR PROPOSAL AND SOURCE SELECTION ...................... 16


3.1 Sections C and J of the RFP ................................................................ 16
3.2 Sections M and L of the RFP ............................................................... 16
3.2.1 Technical Factor Evaluation ....................................................... 18
3.2.2 Management Factor Evaluation .................................................. 21
3.2.3 Past Performance Factor Evaluation ........................................... 26
3.2.4 Proposal Evaluation .................................................................. 27
3.2.5 Cost Factor Evaluation .............................................................. 27
3.2.6 Proposal Risk Assessment Evaluation ......................................... 28

4 CONTRACT EXECUTION .................................................................... 29

APPENDIX A. DEVELOPMENT OF SE INPUT TO SECTIONS M, L, AND PROPOSAL EVALUATION ....................................................................... 32
APPENDIX B. APPLICABLE REFERENCES ............................................ 36
APPENDIX C. ABBREVIATIONS AND ACRONYMS .............................. 38
List of Figures

Figure 1-1 Simplified Government Acquisition Process ................................ 1
Figure 2-1 Relating Acquisition Program Elements to RFP and Technical Attributes ... 7
Figure 2-2 Key Technical Relationships ..................................................... 9
Figure 2-3 Typical Source Selection Organization and Factors .................... 15
Figure 4-1 Establishing an Integrated Program SEP ................................... 30

ODUSD (A&T) Systems and Software Engineering/Enterprise Development ATLED@osd.mil

List of Tables

Table 1-1 Summary of Contracting Activities and SE and PM Roles .............. 2


Table 2-1 Sample SE Items for a SOO during the SDD Phase ..................... 10
Table 2-2 Example Technical Topics for Industry Days .............................. 13
Table 3-1 Sample Questions for Developing Specific SE-Related Criteria and Instructions for Sections M and L ............................................................. 17
Table 3-2 Sample Evaluation Criteria for Technical Solution and Technical Supporting Data ... 19
Table 3-3 Sample Proposal Content Requirement for Technical Solution and Technical Supporting Data ....................................................................... 19
Table 3-4 Sample Evaluation Criteria for System Performance Specification ... 20
Table 3-5 Sample Proposal Content for System Performance Specification ... 20
Table 3-6 Sample Integrated Evaluation Factors for Technical Management ... 22
Table 3-7 Sample Technical Proposal Content for SOW ............................. 23
Table 3-8 Sample Technical Proposal Content for SEP ............................... 24
Table 3-9 Sample Technical Proposal Content for IMP/IMS ....................... 25
Table 3-10 Sample Proposal Content for IMP Narratives ............................ 25
Table 3-11 Sample Technical Proposal Content for Other Management Criteria ... 26
Table 4-1 Systems Engineering Tasks during Post-Award Conference ......... 29
Table 4-2 Technical Tasks during the IBR ................................................ 29
Table 4-3 Establishing the Integrated Program SEP ................................... 30



PREFACE

This Guide for Integrating Systems Engineering into DoD Acquisition Contracts supports the implementation of systems engineering (SE) policy initiatives by the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)) stating that the application of rigorous systems engineering discipline is paramount to the Department's ability to meet the challenge of developing and maintaining needed warfighting capability. Primary references include the following USD(AT&L) memoranda: Policy for Systems Engineering in DoD, 20 February 2004; Policy Addendum for Systems Engineering, 22 October 2004; and Implementing Systems Engineering Plans in DoD - Interim Guidance, 30 March 2004.

The target audience for this guide is the Government program team responsible for (1) incorporating program technical strategy and technical planning into the Request for Proposal (RFP) and (2) performing pre-award functions, including source selection, as well as post-award contractor execution activities. The guide is of most use to the Program Manager (PM), the Lead (or Chief) Systems Engineer (LSE), the Contracting Officer (CO), and the solicitation team.

The primary purpose of this guide is to aid the PM and LSE in effectively integrating SE requirements into appropriate contracting elements in support of system acquisition; however, all Government and industry personnel involved in a program can benefit from this guide. The authors presume the reader is familiar with the Department of Defense (DoD) governing acquisition directive and instruction (DoDD 5000.1, USD(AT&L), May 12, 2003, and DoDI 5000.2, USD(AT&L), May 12, 2003) and the Defense Acquisition Guidebook (DAG); for example, see DAG Chapter 2, Defense Acquisition Program Goals and Strategy. The guide also aids the CO in understanding a program's need for good SE requirements as part of any systems acquisition effort.
The guide focuses on the common competitive-type contract, both fixed-price and cost-reimbursable (see Federal Acquisition Regulation (FAR) Part 16), applying the RFP (FAR 15.203) approach; however, users may be able to tailor the technical aspects to support other acquisition approaches, such as sole-source purchase of commercial-off-the-shelf (COTS) products (e.g., software), information technology (IT) business systems or services, and others. The CO is responsible for all contracting aspects, including determining which type of contract is most appropriate. Nothing in this guide should be construed to change or add to the requirements of existing regulations, directives, instructions, and policy memos.

This guide applies to all phases of the acquisition life cycle (see DoD Life Cycle Acquisition Framework). For simplicity, however, it focuses on preparing for the important System Development and Demonstration (SDD) life cycle phase (i.e., post-Milestone B). The content of the guide can be adapted and tailored for programs entering other acquisition life cycle phases (see DAG Chapter 4, Systems Engineering, and DoDI 5000.2 for more details).

Furthermore, although this guide does not reiterate technical SE information found in the DAG or in the Systems Engineering Plan (SEP) Preparation Guide, it uses the DAG (and SEP guidance) as a basis for guidance noted here as particularly important. This guide does not elaborate on specialty engineering requirements (e.g., Logistics/Sustainment (including Material



Readiness (MR); see DAG Chapter 5, Life Cycle Logistics), Test and Evaluation (T&E) (see DAG Chapter 9, Integrated Test and Evaluation), Modeling and Simulation (M&S), Information Assurance (IA), and Architecture (see DAG Chapter 7, Acquiring Information Assurance and National Security). Each program is unique and needs to consider which technical requirements and solicitation evaluation criteria are important to include in the RFP.

This guide (1) includes brief information on the acquisition and contracting process focused on technical planning and subsequent execution, (2) provides examples of technical inputs needed for the solicitation and source selection, and (3) suggests activities immediately following contract award to assist transition into the SDD phase. The examples and suggestions in this guide should be considered subject to the direction of the CO, Source Selection Authority, or other higher management.

A key technical objective for the program during this pre-Milestone B activity is to provide the potential offerors information describing the Government's program technical approach as reflected in the Government-developed SEP (DAG 4.5.1). Also provided, as available, would be the Integrated Master Plan (IMP; DAG 4.5.2) and Integrated Master Schedule (IMS; DAG 4.5.3) to form a baseline for the offerors to respond to the RFP. Although DoDI 5000.2 does not require approval of the SEP until Milestone B, this guide stresses the importance of early technical planning (and associated documentation in the SEP) so that the Government's technical strategy can be reflected in the RFP. Other key artifacts with technical content (e.g., Initial Capabilities Document (ICD), Analysis of Alternatives (AoA) results, Capabilities Development Document (CDD), Test and Evaluation Strategy (TES) and/or Test and Evaluation Master Plan (TEMP), preliminary System Performance Specification (SPS)) may also be provided to offerors, if available and considered appropriate.
The guide includes links and references to procurement regulations and general acquisition guidance to assist the reader in securing more detailed information. The guide is not all-inclusive but is meant to give program offices a starting point for ensuring that contracts incorporate SE as a critical element in any system acquisition. The authors have tried to avoid references to specific service or agency policies, directives, and guidance, as each organization will need to consider the approach.

The Office of the Secretary of Defense (OSD) office of primary responsibility (OPR) for this guide is DUSD(A&T), Systems and Software Engineering/Enterprise Development (SSE/ED). This office will develop and coordinate updates to the guide as required, based on policy changes and customer feedback. To provide feedback to the OPR, please email the office at ATLED@osd.mil.



1 ACQUISITION PROCESS

This guide focuses on the major technical elements of the Government acquisition process as defined in DoDD 5000.1, The Defense Acquisition System, and DoDI 5000.2, Operation of the Defense Acquisition System. Figure 1-1 is a simplified illustration of DoD's acquisition process with the critical component of contracting. It begins when the warfighter identifies the need (see Joint Capabilities Integration and Development System (JCIDS) 3170.01E) to the acquisition activity, who then translates that need into an actionable requirement and purchase request. The contracting officer (CO) solicits offers from industry and awards a contract. In the final step, the contractor closes the loop by delivering products and services that satisfy the Government need. During acquisition planning, primary responsibility rests with the acquisition activity.

[Figure 1-1 shows the Warfighter, the Acquisition Activity (Program Manager, Systems Engineer, Contracting Officer), and the Contractor linked in a cycle: Step 1, Acquisition Planning (the warfighter's needed capabilities flow to the acquisition activity); Step 2, Contract Formation (solicitation and contract); Step 3, Contract Performance; and Step 4, Deliver to Warfighter (products and services accepted).]

Figure 1-1 Simplified Government Acquisition Process

Acquisition planning is the process of identifying and describing needs/capabilities/requirements and determining the best method for meeting those requirements (e.g., business, program Acquisition Strategy), including solicitations/contracting. Acquisition planning focuses on the business and technical management approaches designed to achieve the program's objectives within specified resource constraints. The Acquisition Strategy, usually developed in the Technology Development phase of acquisition, is approved by the Milestone Decision Authority and provides the integrated strategy for all aspects of the acquisition program throughout the program life cycle. The Systems Engineering Plan (SEP) (SEP Preparation Guide) documents the program's systems engineering strategy and is the blueprint for the conduct, management, and control of the technical aspects of the acquisition program. The Acquisition Plan provides more specific plans for conducting the acquisition and is approved in accordance with agency procedures (FAR Part 7). A Source Selection Plan specifies the source selection organization, evaluation criteria, and procedures, and is approved by the Contracting Officer (CO) or other Source Selection Authority (SSA). All of these documents guide the development of the Request for Proposal (RFP).

It is important that the program team have strong technical and contracting leadership as the program moves through its steps in contract formulation and execution. It is imperative to have the CO involved in the program acquisition planning process as early as possible. The Acquisition Community Connection (ACC) Practice Center Web site is a key source for policy and guidance. Other companion program artifacts include, for example, the Capabilities Development Document (CDD), Technology Readiness Assessment (TRA), Information Support Plan (ISP), Test and Evaluation Master Plan (TEMP), Product Support Strategy (PSS), and Support and Maintenance Requirements.

1.1 Contracting Process

The program manager (PM), chief or lead systems engineer (LSE), and the CO must work together to translate the program's Acquisition Strategy and Acquisition Plan and associated technical approach (as defined in the Government SEP) into a cohesive, executable contract(s), as appropriate. Table 1-1 shows some key contracting-related tasks with indicators of the roles of the PM and LSE.

Table 1-1 Summary of Contracting Activities and SE and PM Roles

Each numbered item below pairs a typical contract-related activity with the corresponding System Engineer and PM roles.

1. Identify overall procurement requirements and associated budget. Describe the Government's needs and any constraints on the procurement.
   Roles: Lead SE (LSE) provides program technical requirements. PM provides any programmatic-related requirements.

2. Identify technical actions required to successfully complete technical and procurement milestones. The program's SEP is the key source for capturing this technical planning.
   Roles: LSE defines the technical strategy/approach and required technical efforts, consistent with the program's Acquisition Strategy and Acquisition Plan within the DoDI 5000.2 requirements.

3. Document market research results and identify potential industry sources. See FAR Part 10 for sources of market research and procedures. Small Business must be considered.
   Roles: PM and LSE identify the programmatic and technical information needed and assist in evaluating the results.

4. Prepare a Purchase Request, including product descriptions; Priorities, Allocations, and Allotments; architecture; Government-furnished property or equipment (or Government-off-the-shelf (GOTS)); Government-furnished information; information assurance and security considerations; and required delivery schedules.
   Roles: PM and LSE ensure the specific programmatic and technical needs are defined clearly (e.g., commercial-off-the-shelf (COTS) products).

5. Identify acquisition streamlining approach and requirements; budgeting and funding; contractor vs. Government performance; management information requirements; environmental considerations; offeror expected skill sets; and milestones. These are addressed in the Acquisition Strategy or Acquisition Plan.
   Roles: The procurement team works together, but the CO has prime responsibility for FAR and Defense FAR Supplement (DFARS) requirements. The PM is owner of the program Acquisition Strategy. The LSE develops and reviews (and the PM approves) the technical strategy.

6. Plan the requirements for the contract Statement of Objectives (SOO)/Statement of Work (SOW)/specification, project technical reviews, acceptance requirements, and schedule.
   Roles: LSE is responsible for the development of the technical aspects of the SOO/SOW. See FAR Part 11.

7. Plan and conduct Industry Days as appropriate.
   Roles: PM and LSE support the CO in planning the meeting agenda to ensure technical needs are discussed.

8. Establish contract cost, schedule, and performance reporting requirements. Determine an incentive strategy and appropriate mechanism (e.g., Award Fee Plan and criteria).
   Roles: LSE provides technical resource estimates; supports development of the Work Breakdown Structure (WBS) based on preliminary system specifications; determines event-driven criteria for key technical reviews; and determines what technical artifacts are baselined. The PM and LSE advise the CO in developing the metrics/criteria for an incentive mechanism.

9. Identify data requirements.
   Roles: LSE identifies all technical Contractor Data Requirements List (CDRL) items and technical performance expectations.

10. Establish warranty requirements, if applicable.
    Roles: LSE works with the CO on determining cost-effective warranty requirements.

11. Prepare a Source Selection Plan (SSP) and RFP (for competitive contracts).
    Roles: PM and LSE provide input to the SSP per the SOO/SOW and Sections L (Instructions to Offerors) and M (Evaluation Factors) of the RFP.

12. Conduct source selection and award the contract to the successful offeror.
    Roles: PM and LSE participate on evaluation teams.

13. Implement requirements for the contract administration office memorandum of agreement (MOA) and/or letter of delegation. The MOA should define performance requirements/attributes.
    Roles: PM and LSE provide input regarding the programmatic and technical support efforts to be included in the MOA and/or letter of delegation. [PM may seek DCMA support.]

14. Monitor and control (M&C) contract execution for compliance with all requirements.
    Roles: PM, LSE, and the program team perform programmatic and technical M&C functions as defined in the contract. They also assist Earned Value Management (EVM) implementation by defining the criteria for completion of technical activities/delivered products.

15. Contract Closeout.
    Roles: This is mostly accounting/administration, but the CO provides status to the PM.

1.2 Important Contract Considerations Affecting Systems Engineering

The following contracting aspects may affect the program's SE efforts and products and should be considered in solicitations:

Organizational Conflict of Interest. The Government acquisition contracting team needs to avoid any organizational conflict of interest (OCI) in SE and technical direction work to be performed by a potential contractor. A potential OCI exists when, because of a contractor's other activities, the contractor may enjoy an unfair competitive advantage, or when award of the subject contract could put the contractor in the position of performing conflicting roles that might bias the contractor's judgment (see FAR 9.5). The CO is responsible for using the general rules, procedures, and examples in the FAR to identify and evaluate potential OCIs as early in the acquisition process as possible and to avoid, neutralize, or mitigate significant potential conflicts before contract award. From the program's point of view, the team must be aware that any current or previous involvement of contractors or consultants in aspects of this, or a related, program may preclude the opportunity of responding to the RFP being prepared. High standards of ethics and professionalism are expected of every participant in the source selection process. Any questions or concerns about procurement integrity or standards of conduct should be brought to the agency ethics official or the CO.

Commercial Item Acquisition. Market research will determine if commercial items (e.g., COTS) or non-developmental items are available and may meet certain technical requirements of the program. The SE team (which includes the LSE and other technical staff: logistics, T&E, IA, etc.) plays a key role in supporting the market research efforts, analyzing the technical attributes and associated costs, benefits, and risks of the various options. Generally, however, the Government's requirements should be described in terms of performance requirements. This is usually part of the Acquisition Strategy. It should be left to the offerors (in their proposals) and the contractor, in design documents delivered to the Government for approval, to describe the planned use of commercial items.

Incentive Contracts. There are several types of incentives, such as award fees, to motivate contractors to excel in performance and reduce risks to the Government. The CO has ultimate responsibility for determining contract type and incentives (see Incentive Strategies for Defense Acquisitions Guide). If an award-fee type contract will be used, the PM and LSE will assist the CO to develop an Award Fee Plan. The award fee generally should be associated with successful completion of discrete events, such as technical reviews, that demonstrate progress toward successfully completing contract requirements. Other award fee criteria may include key system performance parameters and the contractor's cost and/or schedule performance. Consideration should be given to using existing performance metrics, such as the


contractor's Earned Value Management System (EVMS) and other SE and PM tools (see Section 2.4 for more discussion).


2 SYSTEMS ENGINEERING IN ACQUISITION PLANNING

Systems engineering (SE) is an overarching process that the program team applies to transition from a stated capability need to an affordable, operationally effective and suitable system (DAG Chapter 4, Systems Engineering). A brief overview of SE is provided here to set the stage for showing how it becomes a critical aspect of acquisition contracts. SE encompasses the application of SE processes across the acquisition life cycle and is intended to be an integrating mechanism for balanced solutions addressing capability needs, design considerations, and constraints, as well as limitations imposed by technology, budget, and schedule. SE is an interdisciplinary approach, or a structured, disciplined, and documented technical effort, to simultaneously design and develop system products and processes to satisfy the needs of the customer (DAG 4.1).

During the program acquisition life cycle it is critical that an early and consistent application of SE begin at the onset of a program (Concept Refinement and Technology Development (CR/TD) phases). It is recommended that a program SE Integrated Product Team (SE IPT) be formed early in the acquisition planning activity to undertake the technical planning activities. A Lead or Chief Systems Engineer should chair the SE IPT, with other SE/technical Subject Matter Experts (SMEs) as active members, e.g., T&E, M&S, Logistics/Sustainment, Software (SW), IA, security, and safety engineering. For those programs entering directly into the SDD phase, the technical effort begins long before the associated Milestone B and development of the RFP.
The program Acquisition Strategy, including the technical approach, should be documented in an integrated set of Government plans that includes the Acquisition Strategy (DAG Chapter 2, Defense Acquisition Program Goals and Strategy), SEP (DAG 4.5.1 and SEP Preparation Guide), Test and Evaluation Strategy (TES)/TEMP (DAG Chapter 9, Integrated Test and Evaluation), ISP (DAG 7.3.6), Risk Management Plan (Risk Management Guide for DoD Acquisition), Preliminary System Performance Specification (or equivalent), Program Budget, and Government Roadmap and/or Top-Level Program Schedule (Integrated Master Plan and Integrated Master Schedule (IMP/IMS) Preparation and Use Guide). These activities will support the development of the Acquisition Plan and SSP. Building on this solid foundation, the RFP should reflect the Government's policy directives, program Acquisition Strategy, user requirements to meet capability needs, and the program's processes, lessons learned, and sound practices of both Government and industry (see Figure 2-1).

Regardless of the scope and type of program or at what point it enters the program acquisition life cycle, the technical approach to the program needs to be integrated with the Acquisition Strategy to obtain the best program solution.


[Figure 2-1 relates the following program elements:

Reviews and Approvals: Milestone Review; Acquisition Strategy Reviews (Service- and program-peculiar).

Program Documents: Acquisition Strategy; Top-Level Program Plan/Program Schedule; Preliminary System Performance Specification; WBS; Systems Engineering Plan (SEP); Test and Evaluation Strategy (TES)/Test and Evaluation Master Plan (TEMP); ISP/TRA; CONOPS/CDD/AoA; ICE; Program Budget.

Solicitation Planning: Acquisition Plan; Incentive Plan; Source Selection Plan.

Typical RFP Contract Schedule (Sections A-J): B, CLINs and Prices; C, Description/SOW; D, Packaging/Marking; E, Inspection/Acceptance; F, Delivery Schedule; G, Admin Data; H, Special Provisions; I, Contract Clauses. (Section K is by reference.)

Section J Attachments: Top-Level Program Plan and Schedule; Preliminary System Performance Specification; Program WBS; SOO or SOW; SEP (as appropriate); CDRLs.

Sections L and M: L, Instructions to Offerors; M, Evaluation Factors for Award.

Key Program Technical Attributes:
- Technical Enterprise Processes: integrated approach to engineering, test, and logistics/sustainment; technical approach addressing the program's life cycle; event-based technical reviews with independent SMEs; single technical authority; IPT-based organization derived from WBS.
- Contractor's Capability: domain expertise coupled with enterprise processes using experienced personnel; proven past performance (domain and process areas).
- Technical Planning: technical approach integrated with IMP/IMS and EVMS; viable system solution employing mature technology; special design considerations (MOSA, IA, security, safety, etc.).
- Technical Baseline: technical baseline management; requirements management and traceability; product measures linked to technical baseline maturity, financial, and schedule measures/metrics.
- Incentives: SE excellence that results in superior product performance balanced with cost and schedule; SBIR.
- Cost and Schedule Realism: realistic program budgets; optimized program cost, schedule, and performance; realistic tasks and achievable schedules in the IMP/IMS; management of the critical path and near-critical paths.
- Data Access: ownership, control, and delivery of technical baseline data that support the technical and support strategy; timely access to program technical data.]

Figure 2-1 Relating Acquisition Program Elements to RFP and Technical Attributes

2.1 Technical Approach and the Systems Engineering Plan (SEP)

The technical approach for the program begins at the very onset of a program and is documented in the SEP and related plans (e.g., Risk Management Plan). Before source selection, the SEP reflects the Government's technical approach to the program as it moves through the CR/TD, SDD, Production and Deployment (P&D), and Operations and Support (O&S) program acquisition life cycle phases. As defined in the SEP Preparation Guide, the SEP is the blueprint for the conduct, management, and control of the technical aspects of an acquisition program from conception to disposal, i.e., how the SE process is applied and tailored to meet each acquisition phase objective. The process of planning, developing, and coordinating SE and technical management forces thoughtful consideration, debate, and decisions to produce a sound SE strategy for a program commensurate with the program's technical issues, life cycle phase, and overall objectives.

The SEP is the one document that defines the methods by which all system requirements having technical content, technical staffing, and technical management are to be implemented on a program, addressing the Government and all contractor technical efforts. [Note: Until a contractor is selected, this part will represent high-level expectations, within the defined Acquisition Strategy and Plan, of what the contractor will perform to be consistent and integrated


with the Government's SEP.] A few key contract-relevant items are extracted from the SEP Preparation Guide and reiterated here:

- The SEP is about the overall organization of the technical effort, including delineation of authorities, responsibilities, and integration across the Government and contractor boundaries.
- The SEP shows how the SE structure is organized to provide technical management guidance across the Government, prime contractor, subcontractors, and suppliers.
- The SEP provides an overview of Government and contractor data rights for the system, to include what key technical information and data will be developed during the phase being planned.
- The SEP summarizes how the program's selected Acquisition Strategy is based on the technical understanding of the problem at hand and the identified program risks, to include the list of program risks.
- The SEP describes how the contract (and subcontract and supplier, if applicable) technical efforts are to be managed from the Government perspective.

A key requirement of offerors' responses to the RFP is the submission of a fully integrated technical management approach that is expanded from the Government SEP to a fully integrated SEP [Note: traditionally this was documented in an SE Management Plan (SEMP)], which includes the offeror's technical approach, processes, procedures, tools, etc. Also included in the response will be a Contractor SOW (CSOW); an updated, expanded, and integrated Contractor WBS (CWBS), which is correlated with the offeror's EVMS (EVMS Implementation Guide), as appropriate; and the IMP/IMS.

Following the source selection and contract award, the SEPs evolve into a Program SEP (see Section 4 of this guide), documenting the Government and industry shared view of the technical approach and planning for the program. For contractual and management efficiency, the revised Government SEP and the contractor's integrated SEP may remain as two separate documents with appropriate links in an Integrated Development Environment (IDE) to ensure communication and configuration control across the Government and contractor's activities and products as work progresses and changes are authorized. As a program progresses through its life cycle, the level of fidelity and areas of emphasis in the SEP will change. It is important that the program team have a single vision of the technical planning (therefore the individual SEPs will be in alignment) and execution when making a commitment for the design, development, test, and transition of a system/product(s) to satisfy users' operational, logistics, and sustainment needs.

2.2 System Requirements

Sound system requirements (including performance) are the backbone of a good technical strategy and resultant plan (as documented in the SEP and related plans). The performance requirements, as a minimum, must be commensurate with satisfying the threshold for the critical operational (including sustainment and support) requirements (e.g., Key Performance Parameters (KPPs)) and balanced with program cost, schedule, and risk constraints. If these elements are not balanced at the start of the SDD phase, the program has a high probability of incurring cost increases, suffering schedule delays, and/or deficient performance of the end product. An important element of the program's technical plan should be focused on maturing the technical baseline via event-based technical reviews and completeness of T&E (SEP Preparation Guide and DAG Chapter 9, Test and Evaluation) while managing the systematic decomposition and allocation of the requirements down the specification hierarchy (DAG 4.3.3). Figure 2-2 illustrates the relationships among requirements, technical reviews, and the technical baseline.

ODUSD (A&T) Systems and Software Engineering/Enterprise Development ATLED@osd.mil
[Figure 2-2: Requirements Management and the Maturing Technical Baseline. The figure relates event-based technical reviews (ITR, ASR, SRR, SFR, PDR, CDR, TRR, FCA/SVR, PRR, OTRR, PCA, ISR*) to requirements decomposition, allocation, and traceability across the System Specification, Sub-Tier Specifications, and Product Specifications, and to the Functional, Allocated, and Product technical baselines. *See definitions in Appendix C, Acronyms.]
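The decomposition, allocation, and traceability flow depicted in Figure 2-2 can be illustrated with a minimal data-structure sketch. All identifiers and requirement text below are hypothetical illustrations; the guide itself prescribes no particular tool or schema:

```python
from dataclasses import dataclass
from typing import List, Optional

# Minimal sketch of a specification hierarchy: each requirement either sits in
# the system specification or is allocated from (traces upward to) a parent.
# IDs, levels, and requirement text are invented for illustration only.

@dataclass
class Requirement:
    req_id: str
    text: str
    level: str                      # "system", "subtier", or "product"
    parent_id: Optional[str] = None  # allocation/traceability link upward

def find_orphans(reqs: List[Requirement]) -> List[str]:
    """Return IDs of lower-tier requirements with no valid upward trace."""
    ids = {r.req_id for r in reqs}
    return [r.req_id for r in reqs
            if r.level != "system" and r.parent_id not in ids]

reqs = [
    Requirement("SYS-001", "Detect targets at threshold range (KPP)", "system"),
    Requirement("SUB-010", "Radar subsystem sensitivity", "subtier", "SYS-001"),
    Requirement("PRD-100", "Receiver noise figure", "product", "SUB-010"),
    Requirement("PRD-999", "Untraced product requirement", "product", "SUB-777"),
]

print(find_orphans(reqs))  # -> ['PRD-999']
```

A check of this kind mirrors what the requirements management activity must enforce at each technical review: every sub-tier and product requirement traces to a parent in the specification hierarchy.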

Figure 2-2 Key Technical Relationships

2.3 Systems Engineering in the Statement of Objectives (SOO)

When the Government develops a SOO, as opposed to a SOW, in the RFP (and in Attachment J), the SOO is a clear and concise document that delineates the program objectives and the overall approach, particularly critical (or high-risk) requirements that become part of the trade space. The TRA will support this identification. Table 2-1 contains suggested technical/SE items to consider including in a SOO. The SOO does not become part of the subsequent contract. [Note: A SOW, or a PWS, is always included.]


Table 2-1 Sample SE Items for a SOO during the SDD Phase


The program's technical approach will capitalize on Government and industry standards, policies, and directives while leveraging the contractor's domain experience and enterprise processes. The technical objectives for the program are to:
1. Design, develop, test, and deliver a system which meets the performance requirements of the user when operated within the XXX System-of-Systems (SoS) (or within the YYY Family-of-Systems (FoS)).
2. Use contractor enterprise processes to execute the program. Flow down policies and processes to the lowest level of the contractor team (subcontractors, teammates, or vendors) as appropriate. Employ continuous process improvement activities integrating both Government and contractor practices and processes. Ensure Government technical processes, as defined in their SEP, are integrated and consistent with the contractor technical processes.
3. Document the program's technical approach in a Program SEP (including both the Government SEP and the contractor integrated and expanded SEP) that is updated throughout the life of the program.
4. Implement event-based technical reviews that are included in the IMP and IMS with specific entry and exit criteria. Technical reviews include the participation of independent (of the program) subject matter experts.
5. Establish interface management processes which define the inter-system (SoS, FoS) interfaces and intra-system (subsystems, Commercial-off-the-Shelf (COTS), Government-off-the-Shelf (GOTS), etc.) interfaces to support system development.
6. Use contractor configuration management (CM) processes to control the configuration of technical baseline data and product configurations. Provide real-time access to technical product data for program participants. Ensure compatibility with the Government CM processes.
7. Enhance opportunities for incorporation of advanced technology for improved performance and sustainment using Modular Open Systems Approach (MOSA) principles. Encourage use of commercial products and industry-wide standards recognized for high quality.
8. Use modeling, simulation, prototypes, or other means to allow early Government assessment of product maturity and functional capabilities in support of technical reviews along with optimizing system-level testing.
9. Include Government participation on IPTs to gain insight into program progress and streamline the coordination and decision processes. Ensure compatibility and integration with the Government-defined IPTs (see Government SEP).
10. Implement a comprehensive risk management process that is focused on program risk areas and the program's critical path(s) to systematically identify and mitigate cost, schedule, and technical risks. Ensure contractor risk management processes are compatible with the Government risk management process.

The guidance for the SOO is generally applicable to a SOW also.



2.4 Technical Incentive Strategies

The determination and development of an incentive strategy begins early in the program. The incentive criteria should reflect areas of performance for which the Government wants to encourage performance excellence as a risk-reduction activity. A contractual incentive, such as an award fee (refer back to Section 1.2), should focus on the most critical SE issues and/or practices (DAG 4.2.3 and 4.2.4). Two award-fee examples are presented for illustration:

Risk Management Incentive. A contractor's risk management process is one example of an award-fee element that could recognize and reward a contractor that strategically focuses on efficient and effective management practices. Award-fee criteria may include the extent to which the risk management process employed on the acquisition program is integrated across the Government and contractor team. Sample indicators of an integrated process include:
- A risk management process in which shared metrics and risk management systemic analysis are routinely accomplished
- Use of a single risk management database with established links between Risk Management/Technical Reviews/TPMs/EVM/WBS/IMS
- Documented traceability of mitigation efforts
- A risk management process coupled to change control activities
- An enterprise-level view of risk management to prevent the acquisition program from being adversely affected by other enterprise acquisition programs or enterprise-wide challenges.
The risk management information in DAG Chapter 4 and in the Risk Management Guide for DoD Acquisition are sources of other indicators of an integrated risk management process.

Technical Reviews Incentive. A contractor's technical review process is considered extremely important to program success. Award-fee criteria should include timely, or early, completion of design reviews, and award fee should be reduced or eliminated if design reviews are critically late. DAG Chapter 4 elaborates further on key technical reviews.

An important element of any award-fee plan is to ensure that key criteria are measurable to minimize the potential for subjective evaluations and thereby establish a clear understanding between Government and contractor regarding performance incentives.

2.5 Government and Industry Interaction

There should be an environment of open communication prior to the formal source selection process (1) to ensure industry understands the Government requirements and that the Government understands industry capabilities and limitations, and (2) to enhance industry involvement in the Government's development of a program Acquisition Strategy. During the pre-solicitation phase, the Government develops the solicitation and may ask industry to provide important insights into the technical challenges, program technical approach, and key business motivations. [Note: The CO is the Government's principal point of contact with industry.] For example, potential industry bidders could be asked for their assessment of proposed system performance that is achievable based on the maturity level of new technology as well as existing technology. The Government takes the leadership role in this stage.

Lessons learned from past programs suggest that contract formation can be very productive when a highly collaborative environment is created involving user, acquisition, sustainment, and industry personnel to understand and capture the technical challenge and the technical and programmatic approaches needed to successfully execute a program. As can be seen from the Integrated Defense AT&L Life Cycle Management Framework, Market Research begins early in the life cycle (i.e., CR/TD phases) as part of initial risk analysis activities. The CO may develop and provide to industry a draft RFP to enhance an understanding of the customer needs and industry's capabilities to cost-effectively meet these needs.

2.5.1 Market Research

FAR Parts 7, 10, and 11 require the Government to conduct acquisition planning, to include market research (DAG 2.3.16.1.4.1 and 10 USC 2377), as a way to establish the availability of products and vendors which can meet potential needs. Market research supports the acquisition planning and decision process by supplying technical and business information about industry's technology, products, and capabilities. Market research can be used to obtain additional information on a company's technical and management process capabilities along with its domain expertise (DAG 4.2.5); these factors can then be assessed during source selection, rather than by market research alone. Market research should also be used to identify any required sources of supplies or services (FAR Part 8) and restrictions or other issues regarding foreign sources of supplies (FAR Part 25).

2.5.2 Industry Days

Before release of a formal RFP, the Government may hold Industry Days to inform industry of the technical requirements and acquisition planning, and to solicit industry inputs for the pending program.
Both large and small businesses should be encouraged to attend. The CO will establish the agenda for Industry Day and the ground rules for interchange with industry representatives. Table 2-2 provides some example technical-related topics for Industry Days.



Table 2-2 Example Technical Topics for Industry Days

1. The Government should describe its commitment to the program and how it fits into the Service's or Agency's portfolio of programs, including its relationship with other programs.
2. The Government should emphasize and describe its overall technical approach to the program and the interdependencies with cost and schedule. The Government SEP should be made available to industry as a starting point for their technical planning.
3. The Government and industry should discuss trades and analyses that have been conducted during the requirements-generation process. While solution alternatives may be discussed, the emphasis should remain on the resulting performance (including supportability) requirements, not on the specifics of the alternatives. [Note: Some potential offerors may choose not to discuss specifics of their potential alternative in the presence of potential competitors.] The results of Government trades and analyses should be made available to industry, as appropriate, along with JCIDS documents: AoA, ICD, draft CDD, Concept of Operations (CONOPS), etc. These discussions are intended to convey the specific operational and sustainment requirements critical to the program.
4. While it is necessary to investigate potential design solutions that are responsive to the requirements, the Government team should avoid becoming fixated on the solutions. The user sometimes becomes enamored with what he likes, the program team focuses on the one that works, and industry has one it wants to sell. The focus is on establishing the cost-effective system performance requirements that deliver the necessary warfighter capability, not on picking the design solution. Industry Days should inform the solicitation development, not define a solution.
5. The Government presentations and discussions should address the program Acquisition Strategy, the SE approach as being developed in the Government SEP, and how they were established. The discussions should also emphasize the importance of Total Life Cycle System Management (TLCSM) (DAG Chapter 5, Life Cycle Logistics).

2.5.3 Draft Request for Proposal (RFP)

The CO may release a draft RFP prior to a formal RFP to secure industry inputs, comments, and suggestions. The Government team should make the draft as complete as possible. The Government should allow sufficient time (at the CO's discretion) for industry to respond, should seriously consider all industry suggestions and comments, and should modify the solicitation as appropriate to reflect needed changes. After the formal release of an RFP, the exchange of comments, questions, and answers regarding the RFP becomes strictly controlled and is conducted only through the CO. It is much better to make changes before the release of the RFP than to amend it afterward, which may require an extension of proposal preparation time.

Although Market Research, Industry Days, and draft RFPs are important, they are just three of many tools available for exchanging information with industry. Other exchanges of information include:
- Industry or small business conferences
- Public meetings
- One-on-one meetings with potential offerors (with the approval of the CO)
- Pre-solicitation notices
- Requests for Information (RFI)
- Pre-solicitation or pre-proposal conferences
- Site visits
The reader is referred to the FAR and the DFARS for more details.


2.6 Technical Planning in the Source Selection Plan (SSP)

The SSP describes the organization of the source selection team along with the factors and subfactors included in Section M, Evaluation Factors (DFARS Part 215). The program's technical approach, including key performance parameters and risk, should be reflected in the evaluation factors. [Note: "Factors" is the FAR/DFARS term; most SE/technical personnel use the term "criteria" interchangeably.] Figure 2-3 illustrates a typical source selection organization. The Source Selection Evaluation Board (SSEB) oversees the evaluation teams' activities and briefs the findings to the Source Selection Authority, which makes the decision. The Government's technical authority or the program's LSE (DAG 4.1.6, Systems Engineering Leadership) should lead the technical evaluation team. Technical personnel (to include Government SMEs, e.g., system safety, security, IA) should participate on each panel (or committee) of the source selection organization (see Figure 2-3), as necessary to assess each factor and subfactor that forms the basis of the source selection. The evaluation factors and the subsequent evaluation rely upon personnel who are qualified in the functional area and have the past experience and qualifications necessary to make an assessment of proposal credibility.

The technical team supporting the evaluation should include representatives from the acquisition organization, including the Defense Contract Management Agency (DCMA), logistics/sustainment, other appropriate SMEs, and user organizations. To ensure continuity and promote a smooth transition into contract execution, personnel who will be involved in the program should also be involved in developing the SSP and evaluation factors. It is strongly recommended that a qualified (e.g., to include familiarity with the Government SEP) technical/SE program representative also be involved in the Management and Past Performance evaluation teams, since these teams will evaluate the technical organization structure, skills, abilities, experience, and technical/SE management best practices to be employed by the offeror.
Source selection procedures should minimize the complexity of the solicitation by requiring only the information necessary to make a decision and by limiting evaluation factors and subfactors to those that are key discriminators. This enables the source selection decision while fostering an impartial and comprehensive evaluation of offerors' proposals and selection of the proposal representing the best value to the Government.
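To make the idea of limiting factors to key discriminators concrete, the sketch below combines per-factor ratings into a single weighted score. The factor names, weights, and numeric ratings are entirely hypothetical; actual DoD source selections use adjectival and color ratings with a documented tradeoff analysis, not a simple weighted sum:

```python
# Hypothetical weighted-factor scoring sketch. Factor names, weights, and
# 0-10 ratings are invented for illustration; they are not prescribed by
# the FAR/DFARS or by this guide.

weights = {"technical": 0.40, "management": 0.25,
           "past_performance": 0.20, "cost": 0.15}

def weighted_score(ratings: dict) -> float:
    """Combine per-factor ratings (0-10) into one weighted score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must total 1
    return sum(weights[f] * ratings[f] for f in weights)

offer_a = {"technical": 8, "management": 7, "past_performance": 9, "cost": 6}
offer_b = {"technical": 6, "management": 9, "past_performance": 7, "cost": 9}

print(round(weighted_score(offer_a), 2))  # 7.65
print(round(weighted_score(offer_b), 2))  # 7.4
```

Even in this toy form, the point of the guidance is visible: only factors that actually differentiate the offers (here, technical versus cost) move the outcome, so factors that cannot discriminate add solicitation complexity without aiding the decision.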



[Figure 2-3: A typical source selection organization, with the note "Systems Engineering should be integrated in all source selection factors." The organization comprises a Source Selection Authority supported by a Source Selection Advisory Council, the Contracting Officer (CO), a Technical Advisor, and a Source Selection Evaluation Board (SSEB). Evaluation factors shown: Cost*, Quality of Product* (e.g., ILS, excellence), Past Performance*, Management Capability (e.g., personnel qualifications), and Small Business (*mandatory factors per FAR Part 15). SSEB sections, with the guide sections that expand each topic: Technical (3.2.1): SOW, technical solution, SEP, technical supporting data, system performance specification; Management (3.2.2) and Past Performance (3.2.3): technical/management integration criteria, SOW, SEP, IMP/IMS, past performance questionnaire; Cost (3.2.5): systems engineering costs, WBS.]
Figure 2-3 Typical Source Selection Organization and Factors

An offeror's proposal must respond to all of the requirements of the RFP. [Note: A responsive proposal may still not be successful, i.e., win the award.] However, the quality of the proposal has a direct correlation to the clarity and completeness of the Government's requirements in the RFP. [Note: Any ambiguities in the solicitation will be held against the Government in the event of a dispute.] The Government should assign its best personnel to the pre-solicitation team and the source selection team. The source selection team will be exercising judgment and critical thinking when making a selection, and this is best served by using experienced personnel that have domain experience, technical expertise (specifically SE and the other specialty areas noted above), and program knowledge. It may be necessary to train some of the team in the source selection process in more detail than provided herein.



3 REQUEST FOR PROPOSAL AND SOURCE SELECTION

The RFP includes the terms and conditions that will be in the final contract. FAR subpart 15.204-1 specifies the format and content of RFP solicitations and contracts. The RFP typically includes two categories of documentation:

Program Documents: Government Roadmap Schedule, Incentive Plan, Government SEP, ISP, TRA, TES/TEMP, and preliminary SPS are examples of program documents which may be attached to the RFP or available in a Bidders' Library. Other documentation, such as the ICD, CDD, other JCIDS documents, COTS/GOTS data, FoS/SoS interface data, and reports from previous phases of the program, is also typically included in the Offerors' Library. These documents provide background on the program and describe the Government's management and technical approach to the system acquisition. [Note: Several of these documents are required for Milestone B and are described in DAG Chapter 4.]

RFP Documents: A typical RFP includes a model contract with any special clauses (e.g., CLINs, SOO or SOW, CDRL), a preliminary WBS, Evaluation Factors (Section M), and Instructions to Offerors (Section L).

The RFP (with the program documents referenced in the RFP) defines the program and sets the basis for the contract. The following subsections address guidance that could be considered for inclusion in Sections C and J and Sections L and M of an RFP.

3.1 Sections C and J of the RFP

Section C (Description/Specification/SOO or SOW) of the RFP contains the description of the products to be delivered or the work to be performed under the contract. This section typically includes the Government's SOO (or SOW) and preliminary system performance specification. Section J, List of Attachments, lists the attachments such as the initial IMP, Top-Level Program Schedule, Government SEP, CDRLs, and Contract Security Classification Specification (DD Form 254).

3.2 Sections M and L of the RFP

Please note that we have chosen to discuss the specifics of Section M before Section L, since that is the order in which the effort is needed: evaluation factors are defined before one can complete the Instructions to Offerors. In order to accommodate variations among the Services' source selection processes, RFP format nuances, and differences among programs, the discussion of Sections M and L is segmented into four general topics: Technical, Management, Past Performance, and Cost (see Figure 2-3). The technical developers of these sections must work closely with the contracting officer to ensure compliance with appropriate regulations. The following subsections include brief discussions of each topic and example language (in shaded tables) that can be tailored for program RFPs (or other types of solicitation). [Note: It is important to remember that the focus of this guide is on the technical elements of the RFP, and the sample items must be integrated with the rest of the RFP to fit the overall program strategy and program implementation approach.]

Section M of the RFP states the evaluation factors that are used for selecting the contractor. Section M should be carefully structured to address only those elements determined to be discriminators in the source selection, so as to select the best proposal with acceptable program risk. The most effective Section M evaluation factors are measurable, relevant to the program, traceable, expected to differentiate among the offers, and under the offeror's control. Section M should not contain any evaluation factors or subfactors for which there is not a corresponding request for proposal information in Section L. Appendix A has additional tips for program teams developing Section M.

Section L of the RFP instructs the offerors on how to structure their proposal and what should be included in each proposal section. It needs to clearly identify the structure and composition of each volume and section of the proposal and should track to the evaluation factors in Section M. In preparing Sections L and M, be aware of the proposal preparation time and page limitations. Ask only for information that should be readily available to offerors and that is necessary to accomplish the source selection evaluation. Table 3-1 contains a list of SE-related questions to help the team develop the technical aspects of Section M and Section L.

Table 3-1 Sample Questions for Developing Specific SE-Related Criteria and Instructions for Sections M and L

1. How will the evaluation team establish an understanding of the offeror's technical approach?
2. How can the evaluation team develop confidence that the offeror's proposed technical design solutions will meet all technical requirements, including operational performance and logistics/sustainment requirements?
3. Is the technical approach implemented within performance, cost, and schedule requirements?
4. How will the evaluation team evaluate the SoS or FoS interfaces and integration issues on the program?
5. How will the evaluation team establish whether the specific plans for implementing and managing the technical (i.e., SE) and technical management processes are based on company enterprise processes? Is there objective evidence of the capability or maturity of these processes based on industry best practices? How will they be evaluated for consistency and compatibility with the Government's technical and management processes (as defined in the SEP)?
6. How will the evaluation team determine whether the domain experience, past performance, and process maturity of the specific project team, company subgroup, teammates, and subcontractors proposed to execute the work relate directly to the program being bid?
7. How will the evaluation team understand whether the proposed technical solution is adequately supported by studies, analyses, modeling and simulations, and demonstrations?
8. How will the evaluation team evaluate the fidelity and appropriateness of modeling and simulation proposed for the project, and how will it be validated?
9. How will the evaluation team determine whether the offeror's proposed IA approach meets DoD requirements, as well as any security or safety engineering requirements?
10. How will the evaluation team assess the maturity and application of the offeror's proposed processes in the proposal risk assessment?


11. How will the evaluation team determine that the risk management approach proposed is appropriate for the program being bid (e.g., consistent and compatible with the Government's risk management process)?
12. How will the evaluation team determine that the technical cost and resources proposed for the program are reasonable and realistic for the planned program approach?
13. How will the evaluation team establish that the offeror's proposed schedule is realistic and that the critical path(s) analysis is realistic?

Oral presentations by offerors may substitute for or augment written information (see FAR 15.102). Use of oral presentations as a substitute for portions of a proposal can be effective in streamlining the source selection process. Oral presentations may occur at any time in the acquisition process, as determined by the CO or Source Selection Authority, and are subject to the same restrictions as written information regarding timing (FAR 15.208) and content (FAR 15.306). [Note: Discussions may or may not be permitted during oral presentations.] Information pertaining to areas such as an offeror's capability, past performance, work plans or approaches, staffing resources, transition plans, or sample tasks (or other types of tests) may be suitable for oral presentations.

The evaluation team may include a matrix in the RFP that correlates Section L to Section M, so that it is clear which portions of the proposal are expected to contain the information used to evaluate each Section M evaluation factor. It may also be appropriate to develop a matrix that includes other RFP documents.

The next sections present technical example items for inclusion in Sections M and L of the RFP as they relate to the three key evaluation factor areas (i.e., Technical, Management, and Past Performance). Additionally, we address overall Proposal Evaluation, Cost Factor Evaluation, and Risk Assessment Evaluation.
3.2.1 Technical Factor Evaluation

The core of the technical evaluation centers on the offeror's system performance specification, the description of the technical solution, and any supporting data related to trade studies, analyses, modeling, and simulations that have been requested in Section L. [Note: Recall we present Section M (Evaluation Factors) examples before Section L (Instructions to Offerors, e.g., proposal content) examples due to precedence in their determination.]

3.2.1.1 Technical Solution and Technical Supporting Data

An offeror's technical solution, in response to the SOW and other identified requirements, will, in part, be based on analyses that draw on technical supporting data and the resulting performance specifications. These topics are discussed below.

There are two general types of technical data requested in most RFPs. First, there is the description of the proposed technical solution and resulting performance as it relates to the Government's requirements. [Note: A discussion of the specific technical data that describes the offeror's product offering is not addressed here since it is unique to each program.] Table 3-2 and Table 3-3 contain sample items for inclusion in Sections M and L, respectively, for the supporting technical data. The second type of data includes trade studies and analyses, including modeling and simulation results, that provide substantiating data showing not only the performance but also the extent and scope of alternative solutions considered before arriving at the proposed technical solution and specification. Often why something was discarded is as important as what was selected.

Table 3-2 Sample Evaluation Criteria for Technical Solution and Technical Supporting Data

The technical solution and technical supporting data factor (subfactor) is satisfied when the Offeror's proposal demonstrates:
1. The Offeror has conducted a series of trade studies, analyses, and modeling and simulations that systematically evaluated the range of alternatives leading to a preferred technical solution. The results support the technical and program requirements and validate the proposed configuration and the corresponding performance in the system specification.
2. The trade study process was uniformly and consistently applied and followed the Offeror's documented corporate enterprise processes.
3. Trade study and decision criteria addressed the critical cost, schedule, technology, risk, and performance requirements (including operational and sustainment) and other considerations for the program with a high degree of confidence.

Table 3-3 Sample Proposal Content Requirement for Technical Solution and Technical Supporting Data

The Offeror shall provide a summary of the trade studies and analyses accomplished to arrive at the proposed technical solution. The Offeror shall:
1. Describe the trade study, analysis, and modeling and simulation processes implemented to arrive at the proposed technical solution, and explain the level of fidelity of the models and simulations used to support accurate and reliable results.
2. Provide a summary of the trade studies, demonstrations, and analyses results that support the proposed technical solution and program technical approach.
3. Provide a description of the trade study evaluation criteria, how they relate to the key performance requirements and constraints for the program, and the planned technical approach addressed in the contractor's integrated SEP.
The data shall address the range of alternatives considered and the important results that support the technical decisions and the program technical approach. If the contractor plans to mature a technology, backup plans should be assessed as well as risk mitigation planning.

3.2.1.2 System Performance Specification (SPS)
Thedatashall address therangeof alternatives consideredandtheimportantresults that supportthetechnical decisions and theprogram technical approach. If thecontractor plans tomaturea technology, backup plans shouldbeassessedas well as riskmitigation planning. 3.2.1.2 System PerformanceSpecification(SPS)

A preliminary SPSis includedin Section C of theRFP. This specification defines the Governments performance requirements for thesystem. Theofferor responds with aSPS in their proposal thatis tobein thecontract. Table34andTable35contain sampleitems for inclusionin Sections M andL, respectively, for thesystem performancespecification. Rememberwe areusingan SOOor SOW for an RFPas thenominal examplesolicitation information. Thesecan betailoredandmodifiedas appropriatefor othersolicitation packages. Theofferors specification includes theGovernmentrequirements plus any derivedrequirements necessary todescribethesystem level performance. It may includeallocation of requirements andshouldincludecorrespondingverification requirements. TheSPSshouldnot includeSOW

ODUSD (A&T)Systems andSoftware Engineering/Enterprise Development ATLED@osd.mil

19

language, tasks, guidance, or datarequirements but shouldreference necessary industry and approvedmilitary specifications andstandards. Table34 SampleEvaluation Criteriafor System PerformanceSpecification TheOfferors system performancespecification will beevaluatedin conjunction with the
proposed technical solution basedon thefollowingcriteria:
1. Specification includes thekey requirements andfunctionality identifiedin theRFPs
preliminary system performancespecification.
2. Performance (includinglogistics/sustainment/support) requirements arequantifiableand testableand/or verifiable. 3. Objectivevalues (goals) areclearly identifiedanddistinguishedfrom firm requirements. 4. Theoperational andsupport environment is describedanddefined. 5. Environmental design requirements arespecified. 6. Functional, electronic,physical, hardware, andsoftwareinterfaces for thesystem are
included.
7. System FoSandSoSinteroperability andinterface requirements areestablished(both
physical andfunctional). Considers Open Systems andModularity standards.
8. Appropriateuseof Governmentandindustry specifications, standards, andguides. 9. Verification approaches for all system performanceandsustainability requirements
includedin thespecification arecompleteandappropriate.
10. Thespecification does not includeunnecessary requirements andlanguage (e.g., SOW
tasks, datarequirements, andproduct or technical solution descriptions).
Table 3-5 Sample Proposal Content for System Performance Specification

The Offeror shall propose a System Performance Specification that meets the Government minimum requirements. The specification should be performance based and address the allocation of Government performance requirements plus any derived requirements necessary to describe the performance of the integrated system solution. Elements to be addressed in the System Performance Specification include:
1. Accurate and complete understanding of the performance and support requirements in the Government's preliminary system performance specification included in the RFP.
2. Derived requirements necessary to document the system performance and sustainability that will govern the design, development, and test program.
3. Identified and documented system-level operational, physical, and functional interfaces that define the program external interfaces and constraints. SoS and FoS interoperability and interface requirements are included for both physical and functional interfaces. Include considerations for open systems design.
4. A verification section to the specification that delineates the approach to verifying all performance and support characteristics.
5. A cross-reference matrix showing the tracking of Government performance requirements to the Offeror's proposed system performance specification (i.e., traceability). The specification should be structured for the proposed system solution and not restricted by the structure of the Government's preliminary system performance specification. Include cross-reference to verification methods.
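The traceability matrix called for in item 5 lends itself to a mechanical completeness check. The sketch below is a hypothetical illustration only; the requirement IDs, data layout, and helper name are invented for the example and are not part of this guide.

```python
# Hypothetical sketch: verify that every Government performance requirement
# traces to at least one offeror specification paragraph and a verification
# method. Requirement IDs and data structures are illustrative only.
govt_requirements = ["GR-001", "GR-002", "GR-003"]

# Offeror's cross-reference matrix: govt requirement -> (spec paragraphs, verification method)
trace_matrix = {
    "GR-001": (["SPS 3.2.1"], "Test"),
    "GR-002": (["SPS 3.2.4", "SPS 3.3.1"], "Analysis"),
    # GR-003 intentionally missing to show the check working
}

def untraced(requirements, matrix):
    """Return Government requirements with no spec paragraph or no verification method."""
    return [r for r in requirements
            if r not in matrix or not matrix[r][0] or not matrix[r][1]]

gaps = untraced(govt_requirements, trace_matrix)
print("traceability gaps:", gaps)   # -> ['GR-003']
```

An evaluator could run such a check against a machine-readable matrix before assessing the quality of individual trace links by hand.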

ODUSD (A&T) Systems and Software Engineering/Enterprise Development, ATLED@osd.mil


3.2.2 Management Factor Evaluation

The sixteen technical management and technical processes, as defined in DAG 4.2, are as follows:

Technical Management Processes: Decision Analysis, Technical Planning, Technical Assessment, Requirements Management, Risk Management, Configuration Management, Data Management, Interface Management.

Technical Processes: Requirements Development, Logical Analysis, Design Solutions, Implementation, Integration, Verification, Validation, Transition.

These processes are normally evaluated using a combination of the offeror's proposal documents. An offeror is expected to define a tailored (as appropriate) set of technical and management processes, usually based on its own set of mature enterprise processes. These processes are usually correlated with industry-wide recognized standards and best practices. One well-known approach is based on the Capability Maturity Model Integration (CMMI), which has been particularly useful in process improvement initiatives. The acquisition evaluation team is cautioned that there is risk in accepting the applicability of an organization's CMMI maturity (or capability) level rating to future program teams and efforts. Future performance is driven by a large spectrum of issues, such as: specific suppliers/vendors and compatibility of their respective corporate processes; interaction of different corporate units within and across suppliers; the amount of new hires versus existing staff that will be assigned to the contract; timing and intensity of training for all team members; applicability of domain-specific knowledge as a performance factor; and the extent to which corporate processes will be applied to the new work.

In this guide, suggested technical management Section M evaluation factors are presented in an integrated example (see Table 3-6). These factors correlate with the corresponding samples of proposal content in Tables 3-7 to 3-10.


Table 3-6 Sample Integrated Evaluation Factors for Technical Management

This factor (subfactor) is met when the Offeror's proposal demonstrates:
1. The program tasks in the SOW are fully identified and include the technical tasks.
2. Technical planning is complete and supports implementation of the program's technical approach and accomplishment of the requirements and objectives contained in the RFP.
3. Technical and technical management processes are implemented across the program team, using appropriate and adequate tools.
4. The Offeror has implemented a technical baseline approach (functional, allocated, and product baselines) that supports the program's technical approach. Data and software rights are clearly explained.
5. Technical processes are mature and stable and represent the Offeror's application of corporate enterprise processes and lessons learned.
6. Approach, tasks, processes, and procedures are flowed down to the subcontractors, vendors, and lowest-level suppliers, as appropriate.
7. A trained workforce (familiar with the processes, practices, procedures, and tools) is available and in place to ensure accomplishment of the work.
8. Required professional certifications (such as IA certifications required by DoDD 8570.1) are held by offered personnel.
9. Technical events are included in the IMP/IMS and reflect the technical approach.
10. The IMP narratives include the technical and technical management processes and subprocesses (as appropriate).
11. The IMS clearly indicates the program's critical path(s) and has acceptable schedule risk.
12. Technical reviews are identified with explicit entry and exit criteria and established participation, and have the timing and frequency necessary to monitor and control technical baseline maturity and risk mitigation.
13. There is a single technical authority responsible for program technical direction. The lines of responsibility and authority are clearly established.
14. Key personnel are assigned and personnel resources identified.
15. The role of the Government (program office, supporting Government organizations, and user) along with the key subcontractors has been identified.
16. Program IPTs are established that involve program participants and stakeholders for all life cycle phases and identify roles and responsibilities.
17. Program-specific plans represent a sound, integrated technical approach. The plans are flowed down to the teammates, subcontractors, vendors, and lowest-level suppliers on the program. The planning is integrated across the SOW, SEP, IMP/IMS, and other program management plans and processes to support critical path analysis, EVM, and risk management.
18. The Offeror's SEP thoroughly documents the Offeror's technical approach to the integrated set of program requirements, technical staffing and organization planning, technical baseline management planning, technical review planning, and the integration with overall management of the program. It clearly shows how it is integrated, consistent, and aligned (but more detailed) with respect to the Government's SEP.
19. Proactive, disciplined SE technical management process leading indicators provide a picture of the future course a program is likely to follow. The indicators should be measurable, map to incentive strategies, and result in early identification and mitigation of risk.
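The "measurable leading indicators" in item 19 can be made concrete with a small sketch. The indicator chosen here (requirements-closure rate against plan) and its trigger threshold are invented for illustration; the guide does not prescribe specific indicators.

```python
# Hypothetical leading-indicator check: cumulative requirements closed,
# actual versus planned, at successive technical reviews. Numbers are notional.
planned = [50, 120, 200, 280]   # cumulative requirements planned closed
actual  = [48, 110, 160, 205]   # cumulative requirements actually closed

def closure_ratio(actual, planned):
    """Latest actual/planned ratio; a low value flags emerging risk early."""
    return actual[-1] / planned[-1]

ratio = closure_ratio(actual, planned)
print(f"closure ratio = {ratio:.2f}")
if ratio < 0.90:   # notional trigger, e.g., tied to an incentive or risk-mitigation threshold
    print("leading indicator breached: initiate risk mitigation")
```

The point of a leading indicator is the trend: the gap between actual and planned closure widens review by review here, signaling risk well before a milestone slips.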

More specific technical suggestions for individual proposal content (per instructions in Section L) are presented as examples for each of the following subsections of a typical proposal: SOW, SEP, IMP/IMS, and IMP narratives for the Management Volume.

3.2.2.1 Offeror's Statement of Work (SOW)

The offeror responds to the RFP with a SOW [Note: also referred to as the Contractor's SOW (CSOW)] that addresses the objectives stated in the Government's SOO or SOW, other sections of the RFP, and derived requirements based on the offeror's approach. The SOW defines the tasks and activities that the offeror proposes to execute under the contract. The technical approach relies heavily on the contractor's processes and practices, and the SOW should address the application of those processes during design, development, test, manufacture, delivery, and sustainment, as applicable to the program. It is generally not the intent to incorporate the contractor's detailed processes and practices into the contract. Since the SOW will become a baseline for the resulting contract, it should be thoroughly reviewed to ensure it adequately addresses all the work to be accomplished during the program. Table 3-7 provides sample proposal content to be placed in Section L language for the SOW.

Table 3-7 Sample Technical Proposal Content for SOW

The Offeror shall provide a SOW to be included in the negotiated contract. (In the case where the Government provided a SOW with the RFP, the Offeror may propose changes; however, each change shall be accompanied by supporting rationale demonstrating why accepting the proposed change is in the Government's interest.) The SOW shall:
1. Describe the technical work, tasks, and activities to be accomplished on the program that reflect the technical approach to the program as described in the Offeror's SEP.
2. Reflect use of the technical and technical management processes across the program that are critical for program success.
3. Address the technical baseline management process (functional, allocated, and product baselines).
4. Address delivery of, and describe the Government's rights in, all required technical data and computer software.
5. Provide for event-based technical reviews with entry and exit criteria and independent SME participation.
6. Provide for technical planning and the Offeror's integrated SEP updates and continuous process improvement consistent with corporate improvements and program needs. Explain how performance requirements will be verified.
7. Be structured for the proposed system solution, not restricted by the structure of the Government's SOO or SOW, and be correlated and consistent with the integrated WBS, IMP, IMS, and EVMS.
3.2.2.2 Offeror's Systems Engineering Plan (SEP)

The offeror should consider the Government's planned technical management strategy and approach, as reflected in the Government SEP, in preparing its proposal, including its own integrated SEP. As a result, many elements of the Government's SE strategy will be reflected within the contract documents (e.g., SEP, SOW, CDRL, IMP, IMS, and WBS). It is suggested

that the instructions for the offeror's SEP (see Table 3-8) in Section L include the requirement for the offerors to provide a matrix that correlates the Government SEP with the offeror's SEP, contractual documents, and other volumes of the proposal where SEP-amplifying information is discussed.

Table 3-8 Sample Technical Proposal Content for SEP

The Offeror shall submit a SEP that describes its integrated technical approach to the program. The Offeror's SEP shall include:
1. Alignment with all contract-related requirements, tasks, activities, and responsibilities included in the Government SEP, as they relate to this solicitation. If the Offeror elects to change or revise the planned technical approach described in the Government's SEP, the rationale for the change shall be provided.
2. A description of the key technical and technical management processes. Provide the Offeror's (and teammates', subcontractors', etc.) plans for continued process improvement.
3. Flow-down of technical and technical management plans and processes to the subcontractors or teammates, and how they participate in the processes.
4. An event-based program plan (correlated and consistent with the IMP) for the efforts involved with design, development, test, production, and sustainment, including planned block upgrades, technology insertion, etc.
5. Planned technical reviews with entry and exit criteria and independent SME participation.
6. Identity of the technical authority, stakeholders, and functional technical authorities on the program and the limit and scope of their responsibilities.
7. A description of the technical organization within the program IPT structure, identifying roles and responsibilities, key personnel, and technical staffing requirements. Identify the primary participants within each IPT and the supporting participants, to include the Government and subcontractors. Include a summary of the principal products of the IPTs. Include a description of technical working groups (or IPTs) with roles, responsibilities, and proposed participants (e.g., Interface Working Group, T&E Working Group, and Technology Roadmap Working Group).
8. Integration of the technical and technical management processes with IMP/IMS and EVM processes.
9. A summary description of the proposed set of program planning and specific plans, such as the SEP, TEMP, ISP, Software Development Plan, PSP, Risk Management Plan, etc., to ensure consistency and completeness.
10. A matrix that correlates the Government SEP with the Offeror's integrated SEP, proposed contractual documents (SOW, IMP/IMS, WBS), and other volumes of the proposal where SEP-amplifying information is discussed.

3.2.2.3 Integrated Master Plan/Integrated Master Schedule (IMP/IMS)

The RFP should contain a Government Roadmap Schedule (IMP/IMS 3.1.1) that depicts the major program elements and key milestones, such as contract award, event-based technical reviews, technical baseline development and lockdown, developmental test and evaluation (DT&E), operational test and evaluation (OT&E), production or long-lead decisions, and system delivery. Typically, most of the events contained in the program IMP are based on technical activities and normally include the SDD items described in DAG 4.3.3.


The IMP and IMS should clearly demonstrate that the program is executable within schedule and cost constraints and with acceptable risk. They should provide a functionally integrated picture of the proposed program, with a direct correlation between the event-driven activities in the IMP/IMS, the SOW and CWBS, and the planned technical approach documented in the SEP (see Table 3-9). Thus, the IMP/IMS and SEP are key elements during proposal evaluation and source selection. Finally, the IMP/IMS must be correlated and consistent with the defined CWBS and EVMS.

Table 3-9 Sample Technical Proposal Content for IMP/IMS

The Offeror shall submit an IMP that is structured as an event-based schedule. Technical reviews applicable to the contracted event shall be included as events. The maturity of the technical performance approach as well as the status of risk action plans will be reviewed. The IMP shall include events, accomplishments that tie to these events, and completion criteria for each accomplishment for the total contracted effort. Any block upgrades and technology insertions identified as options for this contracted effort shall also be included. The Government Top-Level Program Plan and Schedule and SEP, included in this RFP, define a minimum set of technical events to be included in the proposed IMP. Criteria for entry into any technical event will be tied to the associated accomplishment completion criteria. The Offeror may include additional technical events with associated accomplishments and completion criteria, or more rigorous completion criteria, as required.

The IMS shall include the program schedule with the technical tasks and activities necessary to complete the work effort scoped within the IMP. The program's critical path(s), based on critical path analyses, shall be identified in the IMS. The results of a schedule risk assessment shall be presented which reflect acceptable schedule risk. [Note: It is not uncommon for the Government to specify a minimum schedule risk value, such as an 80 percent probability of achieving the key event(s) with 80 percent confidence.] If the assessment concludes that schedule risk is unacceptable, the Offeror should adjust the schedule or include risk mitigation efforts. Finally, the IMS association with the EVMS shall be summarized; it will also be addressed in the Cost Volume.

Some programs may require a Process Narrative Section with an IMP. Sample text is provided in Table 3-10. [Note: A technical narrative may not be necessary, since the offeror's required integrated SEP will probably cover the appropriate narrative information (IMP/IMS 3.3.3).]

Table 3-10 Sample Proposal Content for IMP Narratives

The Offeror shall include within the IMP process narratives a brief synopsis of the Offeror's systems engineering and technical processes considered essential for program success. The narratives shall reference the Offeror's corporate processes and best practices and indicate how they will be applied and tailored to the specific program.
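The schedule-risk note above (a probability threshold for achieving key events) is typically evaluated with a Monte Carlo schedule risk assessment. The sketch below is a minimal illustration; the task durations, the serial critical path, and the target date are notional assumptions, not values from this guide.

```python
import random

# Hypothetical Monte Carlo schedule-risk sketch for a serial critical path.
# Each task: (optimistic, most likely, pessimistic) duration in weeks (notional).
tasks = [(8, 10, 16), (12, 14, 24), (6, 8, 12)]
target_weeks = 44   # notional key-event date

def simulate_finish(tasks):
    """One trial: sum a triangular-distribution draw for each serial task."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)

def probability_of_meeting(tasks, target, trials=20_000):
    """Fraction of trials finishing on or before the target date."""
    hits = sum(simulate_finish(tasks) <= target for _ in range(trials))
    return hits / trials

p = probability_of_meeting(tasks, target_weeks)
print(f"P(finish <= {target_weeks} wk) = {p:.2f}")
print("schedule risk acceptable" if p >= 0.80 else "adjust schedule or mitigate risk")
```

A real assessment would model the full network with parallel paths and correlations (usually with a dedicated scheduling tool), but the acceptance logic is the same: the simulated probability of meeting the key event is compared to the specified threshold.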


3.2.2.4 Other Technical Management Criteria

The Management Volume can also be used to highlight specific technical management topics that are discriminators for the source selection. These topics are those for which the Government seeks added information or data for the evaluation, over and above what has been addressed in the previous sections on the SOW, SEP, and IMP/IMS (see Table 3-11). These criteria should not be used to systematically address all technical and management processes to be used on the program (those should have been included in the SEP or IMP narratives). This is why it is important that an experienced systems engineer also be on the Management Factor evaluation team.

Table 3-11 Sample Technical Proposal Content for Other Management Criteria

The Offeror shall submit a Management Volume that describes the key technical processes and how they are integrated with the other management, financial, and functional processes. Examples of technical topics for special emphasis in the Management Volume include:
1. FoS and SoS issues and integration approach, and net-centric operation requirements.
2. Program organization, roles and responsibilities of IPTs, and specifically the SE IPT (see also Table 3-8, #7).
3. The electronic or virtual program approach, including data and information exchange (see also Table 3-7, #2, and Table 3-8, #2 and #8).
4. Discussion of risk management and configuration management approaches (see also Table 3-7, #2 and #3, and Table 3-8, #2, #3, and #9).
5. Facilities for design, development, and testing.
6. M&S processes, M&S fidelity, special facilities, M&S support tools, and past applications. [Note: this is a specialty SE example] (see also Table 3-7, #2).
7. Résumés and past experience for the technical leadership and key technical personnel (see also Table 3-8, #7).
8. Discussion of program staffing requirements, surge capability, personnel recruiting, and program ramp-up activities at program start (see also Table 3-8, #7).
9. Discussion of special engineering requirements and processes, such as security engineering, safety, flight certification, survivability/vulnerability, human systems integration (HSI), interoperability, spectrum considerations, and information system security (e.g., IA) (see also Table 3-7, #1, and Table 3-8, #1).
10. Obsolescence requirements, growth plans, and technology insertion/upgrade plans (see also Table 3-7, #1, and Table 3-8, #4).

3.2.3 Past Performance Factor Evaluation

The Government uses the past performance record to demonstrate that the offeror possesses the skill and experience to perform well and achieve the performance requirements on the new contract. An offeror with experienced personnel in the applicable domain, bolstered with a credible past performance record, should result in better contract performance (e.g., lower risk and cost while still achieving the user's performance requirements) (FAR 42.15 and FAR 15.305, as supplemented). The source selection team should relate each offeror's past performance record to the Source Selection Authority (SSA) in a manner that facilitates an integrated assessment with the remainder (e.g., Technical, Management) of the offeror's proposal.

While there is a direct relationship between past performance and the technical factors, each of these evaluations must stand on its own merit, is reported separately, and cannot change the other. For example, the technical evaluation on software could show no major weaknesses while the past performance evaluation could reveal unsatisfactory past performance. It is recommended that the past performance evaluation group start with the Past Performance Information Retrieval System (PPIRS) for past performance information. Most past performance assessments utilize a questionnaire that requests specific information about an offeror's performance from its previous customers. The respondent may be asked to provide a rating ranging from "Exceptional" to "Unacceptable" (or "N/A") and a brief explanation of the rating. This allows the questionnaire to be filled out quickly and easily, but the key to a useful assessment is the evaluation of the respondents' comments and rationale for the rating. Another means to collect the data is for past performance evaluators to contact the respondents and complete the questionnaire via a discussion or interview. The DCMA should be invited to participate in this source selection activity, as determined by the CO or SSA.

3.2.4 Proposal Evaluation

The proposal must be responsive to Section L, Instructions to Offerors. The SSEB (see Figure 2-3) can only use the Section M evaluation factors included in the RFP. No other criteria can be used in the source selection process, just as no outside material other than that submitted with the proposal can be used (except for past performance). The SSEB will be exercising judgment and critical thinking when making a selection, and this is best served by using experienced personnel who have domain experience, technical expertise, and program knowledge. Appendix A contains additional tips that can aid in the evaluation of the technical (Table A-1), management (Table A-2), and past performance factors (Table A-3).

3.2.5 Cost Factor Evaluation

The cost evaluation should address cost reasonableness, realism, and risk. The effective and useful evaluation of cost can best be accomplished when it is supported by technical personnel who: (1) have technical knowledge in the relevant domain; (2) have past, hands-on experience; (3) are familiar with the scope and objectives of the program; and (4) recognize the interdependencies of cost, schedule, and technical performance.

In a proposal, the Basis of Estimate (BOE) supporting rationale and associated assumptions should be based upon meaningful analysis, credible historical data, past experience, and expert judgment. The Government's most probable cost relies on the identification of weaknesses within the proposal (e.g., inconsistencies between the technical approach and the assumptions listed in the BOE) and the computation of adjustments to the offeror's proposed cost or price. Price factors for commercial services or products are also addressed, as appropriate. Tips for the technical portion of the cost evaluation are contained in Appendix A (Table A-4).

There should be specific and comprehensive two-way communication about significant differences between the offeror's proposed costs or prices and the Government's most probable cost estimates. The goal of these discussions is to fully understand the

reasons for, and the magnitude of, the differences between an offeror's proposed costs or prices and the Government's most probable cost estimate, including key cost elements.

3.2.6 Proposal Risk Assessment Evaluation

The proposal risk assessment is typically reported at the factor level (i.e., technical and management); however, there is an option to report the risks at the subfactor level. The SSEB has two options when conducting the proposal risk assessment. The first method is to accomplish the proposal risk assessment for each factor or subfactor. In this case the final evaluation of the factor would have two components: a factor score (usually denoted as a color rating, if used) and a proposal risk rating (ranging from high risk to low risk).

The second method is to combine the factor rating and proposal risk rating together. In this case, for example, a blue ("exceeds standard") color rating might be lowered to a green rating if it involves some risk. In an extreme case, a blue rating might be lowered to yellow or red if the risk is determined to be high. In both cases the proposal risk assessment essentially answers the following question: If the offeror does (or delivers) what it proposes, what is the risk that it will not succeed, i.e., not meet key performance and other critical requirements within schedule and resource constraints?

This assessment establishes the risk associated with the offeror's proposed program, to include the technical approach, technical performance, management approach, application and integration of management and technical processes, program schedule, and cost/resource allocations. Tips for the technical portion of the proposal risk evaluation are contained in Appendix A (Table A-5).
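The second method (combining the factor rating and the proposal risk rating) can be sketched as a simple demotion rule. The color scale and the number of demotion steps per risk level below are illustrative assumptions; source selection plans define their own rating schemes.

```python
# Hypothetical sketch of combining a factor color rating with a proposal risk
# rating (the "second method"). Scale and demotion steps are illustrative only.
COLORS = ["red", "yellow", "green", "blue"]          # worst -> best
DEMOTION = {"low": 0, "moderate": 1, "high": 2}      # steps to lower the rating

def combined_rating(color: str, risk: str) -> str:
    """Lower the color rating by the number of steps assigned to the risk level."""
    idx = COLORS.index(color) - DEMOTION[risk]
    return COLORS[max(idx, 0)]                       # never demote below red

print(combined_rating("blue", "moderate"))  # blue with some risk -> green
print(combined_rating("blue", "high"))      # blue with high risk -> yellow
```

In practice the demotion is a judgment of the evaluation team rather than a fixed formula, but the sketch captures the essential behavior the text describes: risk discounts an otherwise strong factor score.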


4 CONTRACT EXECUTION

During the first few weeks after contract award it is important that the Government and contractor team have an interactive, face-to-face meeting and that the technical leaders step forward and set the tone for the program. Three important program activities immediately after contract award are the Post Award Conference (see Table 4-1 and FAR 42.5), the Integrated Baseline Review (IBR) (see Table 4-2, and also The Program Manager's Guide to the Integrated Baseline Review (IBR) Process and DAG 11.3.1.3), and SEP integration (see Table 4-3).

Table 4-1 Systems Engineering Tasks during the Post Award Conference
1. Reinforce the importance of having the Government and contractor engineering personnel functioning as an integrated team, while recognizing the responsibilities that inherently reside with the contractor (executing the contract), the Government Program Office (program leadership and contract oversight), the user, and DCMA.
2. Review the program technical approach and the plan for alignment of the Government SEP (included in the RFP) with the contractor's inputs to the Government SEP. Validate the technical tasks within the SOW.
3. Review the system performance specification to ensure a mutual understanding of the functional baseline.
4. Reinforce the importance of leveraging the contractor's domain expertise and implementing the contractor's enterprise technical and technical management processes as documented in the proposal SEP.
5. Review and establish the initial set of metrics and measures (the baseline) that will be used to monitor and control the program.
6. Review risk management planning and the Risk Management Plan, if applicable, and the baseline of the program risks. [Note: Program risks include both Government and contractor risks.] Review the risk mitigation plans.
7. Review plans for event-based technical reviews (along with entry and exit criteria and independent SME participation) documented in the IMP and proposal SEP; review the technical tasks and products resulting from the IMS tasks; and ensure correlation of the technical metrics and measures, IMP/IMS, and the EVMS in preparation for the Integrated Baseline Review (IBR).
8. Review and discuss the issues and concerns identified during source selection to ensure they are understood and any issues are resolved.

Table 4-2 Technical Tasks during the IBR
1. Review critical milestones and early program support.
2. Verify the technical approach of the program.
3. Ratify the entry and exit criteria for program events by reviewing the events, accomplishments, and criteria in the IMP.
4. Establish the IMS tasks to support the program. Task durations, resources, and interrelationships should be reviewed and understood.
5. Review the risk management process, establishing the baseline and mitigation plans. Verify acceptance of the integrated Program SEP. Establish the plan for future updates.
6. Actively participate with the financial personnel to establish the EVM baseline. Verify that the EVMS is certified.


Presuming that the RFP required the submittal of an offeror's SEP, the actions to consolidate the Government and offeror's SEPs into a joint Program SEP should be accomplished immediately after contract award (see Table 4-3).

Table 4-3 Establishing the Integrated Program SEP

The following general approach to achieving an integrated (i.e., aligned) Program SEP is recommended:
- Immediately after contract award, the entire program team leadership should establish an integrated Program SEP to reflect both the Government and industry efforts on the program. This integrated Program SEP will usually be two documents: the Government SEP and the contractor's integrated SEP (the latter being a CDRL). Together, they guide all program stakeholders as to the technical aspects of the program.
- The recommended approach to transition the Government SEP into an integrated Program SEP (Government and industry) involves a continuum of activity that begins during RFP preparation and continues through source selection, contract award, and initial program start-up activities (see Figure 4-1).

[Figure 4-1 depicts the Program SEP as a shared vision of the program's technical approach evolving across four phases: RFP Preparation (the acquirer's technical approach as documented in the Government SEP, written by the Program Manager, Lead Systems Engineer, Lead Tester, Lead Logistician, and other SMEs); Source Selection (the offeror's proposed technical approach, based on the offeror's integrated SEP and other supporting technical documents, evaluated by the Source Selection Evaluation Board); Post-Award Planning (the program team's technical approach as documented in the Program SEP and related technical documents, written by the Program Manager, Lead Systems Engineer, Lead Tester, Lead Logistician, and other SMEs from the Government, prime contractor, subcontractors, and suppliers); and Execution (the technical approach is executed and the program's integrated SEP is updated by the program team).]
Figure 4-1 Establishing an Integrated Program SEP

Although this guide focuses particularly on solicitation and contract award, administering the contract throughout the contract period is also important. From the program's perspective, the Program SEP and supporting documentation (e.g., Risk Management Plan, TEMP, ISP) are the basis to monitor and control the program's (including the contractor's) activities and performance. DCMA should be involved, particularly with respect to FAR 42.302 on contract administration. The following activities are particularly relevant to SE:


- Perform engineering surveillance to assess compliance with contractual terms for schedule, cost, and technical performance in the areas of design, development, and production.
- Evaluate for adequacy, and perform surveillance of, contractor engineering efforts and management systems that relate to design development, production, engineering changes, subcontractors, tests, management of engineering resources, reliability, maintainability, data control systems, configuration management, and independent research and development.
- Review and analyze contractor-proposed engineering design studies; submit comments and recommendations to the CO.

These SE activities are well addressed in the DAG's SE technical and management processes, particularly Technical Assessment (see also Section 3.2.2 of this guide).


APPENDIX A. DEVELOPMENT OF SE INPUT TO SECTIONS M, L, AND PROPOSAL EVALUATION

Table A-1 through Table A-5 contain sample questions that can aid the program teams in developing the technical aspects to include in Section M and Section L of the RFP and in the subsequent evaluation of proposals during source selection.

Table A-1 Technical Focus for the Technical Factor Evaluation

Technical Factor (Reference Sections 3.2.1 and 3.2.4)
1. Does the product offering (technical solution) meet performance and sustainment requirements? Does it exceed the requirements, and is this of value to the Government?
2. Does the product reflect the required special design considerations, such as MOSA, safety, information security, net-centric operations, etc.?
3. Are all the critical or key requirements included within the specification? (Watch for parroting of the Government requirements without regard to substantiating evidence in the other sections of the proposal. A claim of performance without substantiating data is a technical risk.)
4. Are the goals appropriately identified and differentiated from firm requirements? (Goals do not have much standing as contract performance requirements.)
5. Are specification requirements stated in performance language? Are there any SOW tasks or data deliveries in the specification?
6. Is the specification's Verification and Test Section more detailed than just a table reflecting only a method of verification? Is there a one-to-one correlation with the performance requirements? Is the test section consistent with the engineering and test approach documented in other sections of the proposal?
7. Are the system interface requirements identified and documented? Do they reflect the requisite SoS and FoS interoperability and interface requirements, as appropriate?
8. Do the analyses, modeling and simulations, and trade studies support the design decisions and technical approach to the program? Is the effort comprehensive (i.e., does it include relevant solutions, technologies, and alternatives), and does it address the areas of technical, cost, schedule, and risk?
9. Does the technical approach address all phases of the product life cycle (i.e., TLCSM) and effectively employ the precepts of System Operational Effectiveness (SOE) and System Design and Operational Effectiveness (SDOE)?
10. Are the resource/cost estimating factors and assumptions for technical work and products supported by the Offeror's domain experience and past performance?


Table A-2 Technical Focus for the Management Factor Evaluation

Statement of Work (Reference 3.2.2.1 and 3.2.4)
1. Does the SOW cover the entire scope of the program, including the requisite technical tasks and activities? Is it consistent with the program technical objectives in the SOO or SOW?
2. Does the SOW include inappropriate items on contract? (For example, technical day-to-day procedures and instructions are captured in excessive detail and then, as they mature during the program, they cannot be implemented without a contract change.) The goal is to secure a commitment to implementing the process, not to control detailed procedures.
3. Does the SOW identify specific IPTs that accomplish the tasks, or include dates for start or completion of tasks? The dates and IPTs are identified in the IMS.
4. Does the SOW include tasks for conducting event-based technical reviews consistent with the program technical and support approach included in the Offeror's SEP?
5. Are all the appropriate technical management processes and technical processes included?

Systems Engineering Plan (Reference 3.2.2.2 and 3.2.4)
1. Does the Offeror's SEP expand and refine the Government SEP provided in the RFP? Is it responsive to the SEP Preparation Guide?
2. Are the corporate enterprise processes to be implemented on the program mature and stable? Is any tailoring or modification of the processes appropriate to the program? Does the tailoring increase cost, schedule, or technical risk? Has the Offeror made a commitment to, and implemented plans for, continuous process improvement?
3. Are the major technical reviews, with entry and exit criteria and independent SMEs, included in the SEP?
4. Has a single technical authority for the program been identified? Have the technical teams' roles and responsibilities within the Offeror's proposed organization been clearly defined and assigned?
5. Have the skill, experience level, and corporate commitment of key technical personnel been identified? Are sufficient manpower resources identified and available to support the program? Are plans for transition and personnel assignments in place for a smooth ramp-up of work tasks without risk of delay?
6. Have the key technical processes and technical management processes critical to program success been integrated with the program management processes, and do they reflect the technical approach in the SEP (e.g., requirements management, technical baseline management and control, risk management, earned value management, supportability, etc.)?

IMP and IMS (Reference 3.2.2.3 and 3.2.4)
Reference IMP/IMS 5.1.8 for detailed guidance on evaluation of an IMP and IMS.
1. Are the SOW tasks reflected in the IMP and IMS, especially the technical baseline management, verification, and validation tasks and event-based technical reviews?

Other Management Criteria (Reference 3.2.2.4 and 3.2.4)
The Other Management Criteria proposal instructions should focus on other technical-related topics that might be critical to program success, and each program should carefully select these special topics, ensuring they are important discriminators for the source selection. Examples of discriminating processes on which the Government might seek added detail include risk management, configuration management, and obsolescence and technology insertion planning. [Note: to the extent these are not adequately addressed in other sections of the proposal.]
Table A-3. Technical Focus for the Past Performance Factor Evaluation

Past Performance Evaluation (Reference 3.2.3 and 3.2.4)
1. In the contracts that are relevant or highly relevant, are the technical approach and domain clearly applicable to the proposed program? Are the contracts similar in scope, and do they apply the same technical and technical management processes with successful results?
2. Is the technical experience of teammates and subcontractors relevant to the allocation of technical tasks?
3. Do the responses to the past performance questionnaire show excellent systems engineering past performance? Do the responses support the Technical and Management Evaluation Criteria in Section M?
4. Have the systems engineering or technical elements in Contractor Performance Assessment Reports (CPARs) been considered? Are there trends or systemic issues across several CPAR evaluations that indicate potential strengths and/or weaknesses in expected performance?
5. Do low CPAR elements have a corrective action plan between the Government customer and the contractor? Is the corrective action on schedule?

Table A-4. Technical Focus for the Cost Factor Evaluation

Cost Evaluation (Reference 3.2.5)
1. Does the cost estimate fully represent the scope of the technical requirements and address all the work and delivery requirements?
2. Do the cost estimates correlate with the proposed solution and technical approach? Is the proposed program cost estimate for the technical portions of the program reasonable and realistic based on your judgment, past experience, or technical knowledge? Are program cost, schedule, and performance balanced?
3. Are the processes, the organization, and the technical tasks and products proposed in other sections of the proposal adequately resourced and included in the cost?
4. Are the technical manpower estimates and BOE adequate and reasonable for the organization, tasks, and schedule reflected in the IMP, IMS, and SOW? Does the skill level of the manpower reflect the complexity of the tasks? Is the BOE supporting rationale based upon credible historical data, past experience, or expert judgment?
5. Is the time phasing of the resources (manpower, facilities, and infrastructure) consistent with the IMP events and the IMS tasks, along with the technical approach in the SEP?
6. Are the costs consistent with the proposed technical work tasks, products, organization and personnel resources, personnel experience level, CWBS, IMP, IMS, and SEP?

Technical Questions for the WBS Evaluation
1. Is the CWBS based on the deliverable products and services and integrated with the program organizational structure and the IMP?
2. Are the WBS elements clear and unambiguous? Does the WBS dictionary adequately describe the systems engineering activities included in other proposal documentation, such as the SOW, IMP, IMS, and SEP?
3. Are the WBS elements clearly and uniquely assigned within the proposed organization?
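Most of the WBS questions above call for judgment, but question 3 (unique assignment of WBS elements within the proposed organization) lends itself to a simple mechanical cross-check of the proposal data. The sketch below is illustrative only and is not part of the guide; the WBS numbers, element names, and IPT names are hypothetical.

```python
# Illustrative sketch: flag WBS elements assigned to more than one
# organizational unit. All WBS numbers and organization names are hypothetical.

# Hypothetical CWBS-to-organization assignments extracted from a proposal:
# (WBS number, element name, responsible organizational unit)
assignments = [
    ("1.1", "Air Vehicle", "Airframe IPT"),
    ("1.2", "Systems Engineering", "SE IPT"),
    ("1.3", "Program Management", "PM Office"),
    ("1.2", "Systems Engineering", "Avionics IPT"),  # duplicate ownership
]

def find_ambiguous_elements(assignments):
    """Return WBS numbers mapped to more than one organizational unit."""
    owners = {}
    for wbs_id, _name, org in assignments:
        owners.setdefault(wbs_id, set()).add(org)
    return {wbs_id: orgs for wbs_id, orgs in owners.items() if len(orgs) > 1}

for wbs_id, orgs in find_ambiguous_elements(assignments).items():
    print(f"WBS {wbs_id} has multiple owners: {sorted(orgs)}")
```

A check like this only surfaces candidates for evaluator follow-up; whether a shared assignment is actually a weakness remains a source selection judgment.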

Table A-5. Technical Focus for the Proposal Risk Assessment Factor Evaluation

Proposal Risk Assessment (Reference 3.2.6)
1. Are technical claims of performance supported by credible analyses, trade studies, or modeling and simulation results?
2. Does the Offeror's domain experience support the program approach and the technical challenges on the program? Have experienced personnel been proposed to lead the technical activities and organization?
3. Are the technical and technical management processes mature and stable, and are the modifications to the corporate processes appropriate to the program? Do the processes (or lack of processes) introduce risk into the program?
4. Are there corporate plans in place for continued process improvement?
5. Have the key technical and technical management processes determined critical to program success been integrated into the program management and technical approach (e.g., configuration management, requirements management, technical baseline control, risk management, technology insertion/obsolescence planning, modeling and simulation planning)? Are these flowed down to teammates, subcontractors, vendors, and suppliers, as appropriate?
6. Are the technical and technical management processes integrated with the other program management processes (EVMS, IMP, IMS, and CWBS)?
7. Have the technical risks been evaluated with respect to their relationship to the program's critical path(s)? Are the risk mitigation tasks included in the IMS? Are the risk mitigation tasks reasonable, complete, and appropriate for the risk?
8. Is the program schedule reasonable and realistic? Is it consistent with the planned execution of the program as stated in the IMP, IMS, and SEP? Have the activities on and near the critical path been evaluated?

APPENDIX B. APPLICABLE REFERENCES

Acquisition Community Connection (ACC) Practice Center
(https://acc.dau.mil/CommunityBrowser.aspx)

Amplifying DoDD 5000.1 Regarding Modular Open Systems Approach Implementation, USD(AT&L), 5 April 2004
(http://akss.dau.mil/docs/wynn_memo,%205000.1%20and%20MOSA.pdf)

Army Source Selection Guide
(http://www.amc.army.mil/amc/rda/rdaap/docs/assg2001.pdf)

Award Fee Contracts, USD(AT&L), 29 March 2006
(http://www.acq.osd.mil/dpap/policy/policyvault/20060334DPAP.pdf)

Defense Acquisition Guidebook (DAG)
(http://akss.dau.mil/dag/)

Designing and Assessing Supportability in DoD Weapon Systems: A Guide to Increased Reliability and Reduced Logistics Footprint, OSD, 24 October 2003
(https://acc.dau.mil/CommunityBrowser.aspx?id=32566)

Defense Federal Acquisition Regulation Supplement (DFARS)
(http://farsite.hill.af.mil/VFDFARA.HTM)

DoD Earned Value Management Implementation Guide (EVMIG)
(http://akss.dau.mil/guidebookalphabeticLinks.do?initialChar=E)

Implementing Systems Engineering Plans in DoD: Interim Guidance, 30 March 2004
(http://www.acq.osd.mil/ds/se/publications/pig/Implementing%20SE%20Plans%20In%20DoD%20%20Interim%20Guidance%20%2030%20Mar%2004.pdf)

Federal Acquisition Regulation (FAR)
(http://farsite.hill.af.mil/vffara.htm)

Guidebook for Performance-Based Services Acquisition (PBSA) in the DoD
(http://www.acq.osd.mil/dpap/Docs/pbsaguide010201.pdf)

Incentive Strategies for Defense Acquisitions Guide, USD(A&T), 5 January 2001
(http://www.dau.mil/pubs/misc/incentive.asp)

Integrated Master Plan and Integrated Master Schedule (IMP/IMS) Preparation and Use Guide, 21 October 2005
(http://www.acq.osd.mil/se/publications/pig/IMP_IMS_Guide_v9.pdf)

Joint Capabilities Integration and Development System (JCIDS), CJCSI 3170.01E, 31 May 2005
(http://akss.dau.mil/docs/CJCSI%20317001EFinal.pdf)

MIL-HDBK-881A, Work Breakdown Structure Handbook
(http://www.acq.osd.mil/pm/currentpolicy/wbs/MIL_HDBK881A/MILHDBK881A/WebHelp3/MILHDBK881A.htm)

Navy SD-5, Market Research
(http://www.ntsc.navy.mil/resources/library/acqguide/sd5.htm)

Operation of the Defense Acquisition System, DoDI 5000.2
(http://akss.dau.mil/dag/DoD5000.asp?view=document&doc=2)

OSD Guide to Collection and Use of Past Performance Information
(http://www.ntsc.navy.mil/resources/library/acqguide/PPI_Guide_2003_final.pdf)

Performance Based Acquisition (PBA), USD(AT&L), 6 September 2006
(https://acc.dau.mil/pba)

Policy for Systems Engineering in DoD, 20 February 2004
(http://www.acq.osd.mil/ds/se/publications/pig/Policy for Systems Engineering in DoD 20 Feb 04.pdf)

Policy Addendum for Systems Engineering, 22 October 2004
(http://www.acq.osd.mil/ds/se/publications/pig/Policy Addendum for Systems Engineering 22 Oct 04.pdf)

Program Manager's Guide: A Modular Open Systems Approach (MOSA) to Acquisition, Version 2.0, September 2004
(http://www.acq.osd.mil/osjtf/pmguide.html)

Risk Management Guide for DoD Acquisition, 6th edition, Version 1.0, August 2006
(http://www.acq.osd.mil/se/ed/publications/2006%20RM%20Guide%20%204%20Aug%2006%20%20final%20version.doc)

Seven Steps to Performance-Based Services Acquisition
(http://www.arnet.gov/comp/seven_steps/index.html)

Systems Engineering Plan (SEP) Preparation Guide, 10 February 2006
(http://www.acq.osd.mil/se/publications/pig/sep_prepguide_v1_2.pdf)

The Defense Acquisition System, DoDD 5000.1
(http://akss.dau.mil/dag/DoD5000.asp?view=document&doc=1)

The Program Manager's Guide to the Integrated Baseline Review (IBR) Process
(http://www.acq.osd.mil/pm/ibrmats/IBR Documents/IBR_PM_Guide_April_2003.doc)

APPENDIX C. ABBREVIATIONS AND ACRONYMS

ACC     Acquisition Community Connection
AoA     Analysis of Alternatives
APB     Acquisition Program Baseline
ASR     Acquisition Strategy Report
AT&L    Acquisition, Technology and Logistics
A&T     Acquisition and Technology
BOE     Basis of Estimate
CDD     Capability Definition Document
CDR     Critical Design Review
CDRL    Contract Data Requirements List
CLIN    Contract Line Item Number
CM      Configuration Management
CMMI    Capability Maturity Model Integration
CO      Contracting Officer
CONOPS  Concept of Operations
COTS    Commercial-off-the-Shelf
CPAR    Contractor Performance Assessment Report
CPD     Capability Production Document
CR      Concept Refinement phase
CSOW    Contractor Statement of Work
CWBS    Contract Work Breakdown Structure
DAG     Defense Acquisition Guidebook
DCMA    Defense Contract Management Agency
DFARS   Defense Federal Acquisition Regulation Supplement
DoD     Department of Defense
DT&E    Developmental Test and Evaluation
DUSD    Deputy Under Secretary of Defense
ED      Enterprise Development
EVM     Earned Value Management
EVMS    Earned Value Management System
FAR     Federal Acquisition Regulation
FCA     Functional Configuration Audit
FoS     Family-of-Systems
GOTS    Government-off-the-Shelf
HSI     Human System Interface
IA      Information Assurance
IT      Information Technology
IBR     Integrated Baseline Review
ICD     Initial Capabilities Document
ICE     Independent Cost Estimation
IDE     Integrated Development Environment
IMP     Integrated Master Plan
IMS     Integrated Master Schedule
IPT     Integrated Product Team
ISP     Integrated Support Plan
ISR     In-Service Review
JCIDS   Joint Capabilities Integration and Development System
KPP     Key Performance Parameter
LSE     Lead (or Chief) Systems Engineer
MOSA    Modular Open Systems Approach
MR      Material Readiness
M&C     Monitor and Control
M&S     Modeling and Simulation
OCI     Organizational Conflict of Interest
OPR     Office of Primary Responsibility
OSD     Office of the Secretary of Defense
OTRR    Operational Test Readiness Review
OT&E    Operational Test and Evaluation
OUSD    Office of the Under Secretary of Defense (AT&L)
O&S     Operations and Support phase
PBSA    Performance Based Services Acquisition
PCA     Physical Configuration Audit
PM      Program or Project Manager
PRR     Production Readiness Review
PPIRS   Past Performance Information Retrieval System
PSP     Product Support Plan
PSS     Product Support Strategy
PWS     Performance Work Statement
P&D     Production and Deployment phase
RFI     Request for Information
RFP     Request for Proposal
SDD     System Development and Demonstration phase
SDOE    System Design and Operational Effectiveness
SE      Systems Engineering
SEIPT   Systems Engineering IPT
SEP     Systems Engineering Plan
SEMP    Systems Engineering Management Plan
SME     Subject Matter Expert
SMR     Sustainment Material Readiness
SOE     System Operational Effectiveness
SOO     Statement of Objectives
SoS     System-of-Systems
SOW     Statement of Work
SPS     System Performance Specification
SSA     Source Selection Authority
SSE     Systems and Software Engineering
SSEB    Source Selection Evaluation Board
SSP     Source Selection Plan
SW      Software
SVR     System Verification Review
TD      Technology Development phase
TDS     Technology Development Strategy
TEMP    Test and Evaluation Master Plan
TES     Test and Evaluation Strategy
TLCSM   Total Life Cycle System Management
TRA     Technology Readiness Assessment
T&E     Test and Evaluation
WBS     Work Breakdown Structure
