The guideline for scoring on market knowledge is as follows. The meaning of each guiding point is elaborated on below; however, it should be noted that these are broad definitions that can be made concrete only for a given context and organisation. Initial: Only a faint touch of market knowledge is needed to implement the business idea.
One example of this could be the formulation of a new pricing model by a telecom provider; here, the level of market knowledge used is minimal. Basic: Uses market fundamentals to implement the business idea; an improvised use of an existing technique could be an example of this. Advanced: Uses advanced market knowledge to implement the business idea.
Expert: Uses knowledge at the forefront of the market to implement the idea. In some cases, this could mean discoveries or new concepts in the market are needed for the implementation of the business idea. An example of this is the search engine algorithm behind the Google search paradigm.
For the innovation dimension: Basic: When innovation is present but of a limited nature, the idea would be categorised in this section.
A classic example of this could be a user interface change in existing software to make it easier to use or more effective. Incremental: For innovation that is substantial but does not open up completely new opportunities. One example of this could be the "undo" button found in various software systems.
Breakthrough: This type of innovation could bring about a complete change in the opportunities available. A good example of this would be a drug that cures cancer.
For the technology dimension: Advanced: A business idea with considerable use of technology would be categorised here. One example of this could be fly-by-wire technology for aircraft operation. Cutting edge: When the technological need becomes so crucial to the success of the business idea that it must use forefront technologies. One example is the recently introduced CyberKnife system for treating cancer patients.
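A minimal sketch of how this rubric might be encoded is shown below. The level names follow the guideline above; the numeric scale, the assumed lower levels for the technology dimension, and the simple summation used to combine the three dimensions are illustrative assumptions rather than part of the guideline.

```python
# Sketch of the scoring rubric described above. Level names come from the
# guideline; the numeric scale, the assumed "basic" level for the technology
# dimension, and the summation are illustrative assumptions.

MARKET_KNOWLEDGE = {"initial": 0, "basic": 1, "advanced": 2, "expert": 3}
INNOVATION = {"basic": 0, "incremental": 1, "breakthrough": 2}
TECHNOLOGY = {"basic": 0, "advanced": 1, "cutting edge": 2}  # lower level assumed

def score_idea(market: str, innovation: str, technology: str) -> int:
    """Combine the three dimension ratings into a single score (simple sum)."""
    return (MARKET_KNOWLEDGE[market.lower()]
            + INNOVATION[innovation.lower()]
            + TECHNOLOGY[technology.lower()])

# Example: a new pricing model from a telecom provider -- minimal market
# knowledge, limited innovation, basic technology.
print(score_idea("initial", "basic", "basic"))  # -> 0
```

A weighted combination could replace the plain sum once an organisation has decided how much each dimension matters in its own context.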
Consult the references [1, 2, 3, 4] for each service tool, listed at the end of this paper, to better understand how metrics are scored. Best Practices and Lessons Learned: Determining the value of each metric is the responsibility of the acquisition program team.
System engineering inputs are relevant to most of the reporting items; some are more obvious than others. At the outset of an acquisition, a risk management process should be in place (see the Risk Management section within this guide); the ability of this process to adequately identify and track risks is a major component of the Probability of Program Success (PoPS) tool.
All technical risks should be incorporated in this assessment, including any that are also captured in the technical maturity assessment.
Immature technology can be a considerable risk to program success if not managed appropriately; it also can be scheduled for insertion into the program delivery schedule upon maturation.
For more detail on technology maturity, see the article Assessing Technical Maturity in this section. Note: This structure is generic and meant to closely represent what the services capture in their respective PoPS tools and where. Although a metric name may be different or absent when comparing one tool to another, the same or similar qualities may be captured under a different metric. Conversely, metrics may share the same or a similar name but capture different qualities of the program.
Refer to the individual service's PoPS operations guide for details [1, 2, 3, 4]. A subset of program management metrics is specific to contractor earned value. Although earned value management (EVM) is mostly considered a monitoring tool for measuring project performance and progress, it is also a planning tool.
Using EVM effectively requires the ability to define, schedule, and budget the entire body of work from the ground up. Best Practices and Lessons Learned: Fundamental to earned value is linking cost and schedule to work performed. However, work performed is often specified at too high a level to identify problems early. This is linked back to the generation of the WBS during initial program planning and whether it was created at a sufficiently detailed level. In cases where the detail is insufficient, EVM is unlikely to report real problems for several months.
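As a concrete illustration of the cost and schedule linkage, the sketch below computes the standard earned value indicators from planned value (PV), earned value (EV), and actual cost (AC). The work packages and dollar figures are invented for illustration; only the formulas are standard EVM definitions.

```python
# Standard earned value indicators. PV = budgeted cost of work scheduled,
# EV = budgeted cost of work performed, AC = actual cost of work performed.
# The work packages and dollar figures below are purely illustrative.

packages = [
    # (name, PV, EV, AC)
    ("Requirements", 100.0, 100.0,  95.0),
    ("Design",       200.0, 190.0, 205.0),
    ("Integration",  150.0,  60.0, 120.0),
]

pv = sum(p[1] for p in packages)
ev = sum(p[2] for p in packages)
ac = sum(p[3] for p in packages)

cv = ev - ac    # cost variance (negative means over cost)
sv = ev - pv    # schedule variance (negative means behind schedule)
cpi = ev / ac   # cost performance index (< 1.0 means over cost)
spi = ev / pv   # schedule performance index (< 1.0 means behind schedule)

print(f"CV={cv:.1f}  SV={sv:.1f}  CPI={cpi:.2f}  SPI={spi:.2f}")
```

Computing the same indices per work package rather than only at the total level is what a sufficiently detailed WBS buys: in the sample data, the Integration package is the one driving the unfavourable variances, which a single roll-up number would not reveal.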
It is usually program engineers and acquisition analysts who are able to identify and report technical and schedule problems before EVM can report them. Technical performance measures (TPMs) are metrics that track key attributes of the design to monitor progress toward meeting requirements [7, 8]. As the team explored program performance, successful and otherwise, it became apparent that ultimate programmatic success depended on more than just successful management of cost, performance, and schedule risk. The correct set of instructions to be used by a particular program depends on the program's current life cycle phase, as depicted in Figure 1.
Programs in the planning phase should use the Planning instructions provided in Chapter 3 (see the introduction to Chapter 3 for a definition of what constitutes the planning phase). Programs prior to Milestone B, as defined in DoD acquisition policy, should use the instructions for that phase. Programs which are beyond Milestone C, but not yet in sustainment, should use the instructions of Chapter 6, and programs in sustainment should use the instructions of Chapter 7. For major modification programs or upgrades of capabilities to a system in sustainment, the modification or upgrade program will go through the appropriate acquisition phase within this document: planning, pre-Milestone B, post-Milestone B, or post-Milestone C.
Figure 2 outlines the acquisition phase documentation that will be referenced throughout this Operations Guide. For space programs, however, KDP C occurs while the system design is still being performed; this is considerably earlier than Milestone C for corresponding DoD programs. Similarly, the guide has been written to be as widely applicable as possible for Air Force programs, so space users will need to keep this in mind when reading the guide, particularly regarding sustainment functions.
Figure 2. Acquisition Phase Documentation. Notes: 1. Summarized in Acquisition Strategy. 2. Milestone C if no Milestone B. 3. MAIS whenever an economic analysis. 5. Program initiation for ships. 7. Milestone C if equivalent to FRP.
It reflects all of the Level 1 factors and Level 2 metrics. The numeric value is a whole number unless otherwise indicated. As a general rule, the status color and numeric value for each metric should be based on the worst-case sub-rating for that metric.
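A sketch of that worst-case roll-up rule follows, assuming the conventional Green/Yellow/Red ordering of severity; the sample sub-ratings are placeholders.

```python
# Worst-case roll-up: a metric's status color is the worst of its sub-ratings.
# The Green < Yellow < Red severity ordering and the sample inputs are assumptions.

SEVERITY = {"Green": 0, "Yellow": 1, "Red": 2}

def metric_status(sub_ratings: list[str]) -> str:
    """Return the highest-severity (worst-case) color among the sub-ratings."""
    return max(sub_ratings, key=lambda color: SEVERITY[color])

print(metric_status(["Green", "Yellow", "Green"]))  # -> Yellow
print(metric_status(["Green", "Green", "Red"]))     # -> Red
```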
The process is designed to allow the leadership to quickly focus on specific areas of interest. Input frequency is correlated to key events in the annual Planning, Programming, and Budgeting System (PPBS) cycle. Major program events, such as a re-baseline, require an on-demand update of all elements.
In the interim, an MS Excel spreadsheet has been created with features that allow a single spreadsheet to be used for programs in any life cycle phase. The spreadsheet also allows archiving of PoPS evaluations from month to month, providing a historical record of scores and the rationale for score assignment.
Once the spreadsheet is initialized for a particular program, the only tab used from month to month will be the Metrics tab. The Criteria tab is used for initialization; once the Criteria tab is set, the Metrics tab is used to score the program. Note: The steps to use the spreadsheet are described below.
The first four steps are required to initialize the spreadsheet the first time a program is scored; subsequent evaluations should begin at Step 5. This data will be automatically transferred to the Windshield Chart during Step 7. Step 3: The user should determine whether any points need to be reallocated from metrics which are not being used to other metrics which have been identified as eligible to receive reallocated points.
If points will not be reallocated, skip Step 2 and proceed to Step 3. If points will be reallocated, the user should again use the Criteria tab to adjust the criteria for color coding the metrics, or manually adjust the color coding based on the operations guide. Step 4: Click on the Metrics tab. A pop-up menu will be revealed at the right side of the life cycle phase cell; when this is highlighted, the menu offers a selection of life cycle phases. When the user selects the proper phase for the program and clicks on a cell, the spreadsheet will auto-populate with the correct factor and metric titles and point values.
This needs to be done only the first time the spreadsheet is used, unless a program proceeds into another life cycle phase. Step 5: Now the user is ready to evaluate the program. For the first evaluation, the metric colors should be initialized to Green (or Gray for metrics which are N/A); this initialization sets the colors to Green and the scores to the maximum available. Scoring of individual metrics can now proceed. The Factors will automatically reflect the sum of the metric scores.
Where a KB is entered in a metric cell to indicate a Killer Blow, the cell is colored Red and assigned a zero value. A summary of the evaluation scoring criteria has been embedded in the Factor and Metric cells and can be revealed by placing the cursor over the cell.
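The scoring behaviour from Step 5 and the Killer Blow rule can be sketched as follows. The metric names, point values, and the color-to-score relationship shown here are placeholders, not values taken from any service's PoPS tool.

```python
# Sketch of the metric scoring logic: each metric starts Green at its maximum
# point value, a Killer Blow (KB) forces Red and a zero value, and a factor's
# score is the sum of its metric scores. Names and point values are placeholders.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Metric:
    name: str
    max_points: float
    color: str = "Green"            # initialized to Green per Step 5
    score: Optional[float] = None   # None means "still at the maximum"
    killer_blow: bool = False       # KB entered in the metric cell

    def value(self) -> float:
        if self.killer_blow:        # Killer Blow: Red cell, zero value
            return 0.0
        return self.max_points if self.score is None else self.score

    def status(self) -> str:
        return "Red" if self.killer_blow else self.color

def factor_score(metrics: list[Metric]) -> float:
    """A factor's score is the sum of its metric scores."""
    return sum(m.value() for m in metrics)

# Placeholder metrics for a hypothetical factor.
requirements = [Metric("Parameter stability", 4.0),
                Metric("Concept of operations", 2.0, killer_blow=True)]
print(factor_score(requirements))          # -> 4.0
print([m.status() for m in requirements])  # -> ['Green', 'Red']
```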
The detailed evaluation criteria contained in this Operations Guide should still be referenced for completeness. Step 6: Once the program is scored and the rationale for the score is recorded in the Comments column, the data is ready to be archived. To archive the Metric data, click on the Archive Data button at the top right of the Metrics spreadsheet.
To archive Summary data, perform the same function at the top right of the Summary spreadsheet. Each month the current Summary and Metrics evaluations will be displayed in columns A-F of the Archives tabs, and the previous evaluations will be moved to the right.
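The month-to-month archive behaviour (newest evaluation in the leftmost archive columns, older evaluations shifted right) amounts to the simple bookkeeping sketched below; this models the record-keeping only and is not the spreadsheet macro itself.

```python
# Newest evaluation goes in the leftmost archive slot; earlier evaluations
# shift one slot to the right. Month labels and scores are illustrative.

def archive_evaluation(archive: list[dict], evaluation: dict) -> None:
    """Insert the current evaluation at the front, pushing older ones right."""
    archive.insert(0, evaluation)

history: list[dict] = []
archive_evaluation(history, {"month": "Jan", "score": 87, "rationale": "baseline"})
archive_evaluation(history, {"month": "Feb", "score": 82, "rationale": "schedule slip"})
print([entry["month"] for entry in history])  # -> ['Feb', 'Jan']
```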
Step 7: To automatically create the Windshield Chart, click on the Create Powerpoint Slide button at the top right of the Metrics spreadsheet. Program details are provided when you drill down on the PoPS score. Figure 3. Manual Upload of Windshield Chart. Planning PoPS should be used on efforts which have not yet received formal approval, funding, or direction.
Once a formal Program Decision is received or a contract is awarded, the program should be evaluated based on the criteria associated with the life cycle phase appropriate for the approved program. While planning a new effort, it is important that the program team properly plan for all aspects of the program, leaving little to chance.
Aspects of the program plan that may lead to execution risk should be highlighted as much as possible in order for the decision makers to understand any risk that may exist in the proposed plan. Program Planning Windshield Chart Example. Additionally, PMs can add parameters not currently listed. The criteria below should be consistent with the ACAT level of the program being evaluated.
All major U.S. This is not applicable for planning programs that will transition to Pre-MS B. Changes to requirements are negligible from month to month; any changes are leading to less risk in execution. Two or more requirements changes have occurred over the last three months, leading to increased risk of achieving a stable, low-risk baseline.
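The requirements-stability criterion above lends itself to a simple check, sketched below. The three-month window and the "two or more changes" threshold come from the criteria text; the mapping of change counts to specific colors, including the intermediate Yellow case, is an assumption for illustration.

```python
# Sketch of a requirements-stability check: count requirements changes over
# the trailing three months. The Green/Yellow/Red mapping is an assumption;
# the guide's text distinguishes "negligible" change from two or more changes.

def requirements_stability(changes_last_three_months: int) -> str:
    if changes_last_three_months == 0:
        return "Green"    # negligible change, lower execution risk
    if changes_last_three_months < 2:
        return "Yellow"   # assumed intermediate rating
    return "Red"          # two or more changes: risk to a stable baseline

print(requirements_stability(0))  # -> Green
print(requirements_stability(3))  # -> Red
```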