Evaluating Information Technology Investments

Introduction

In fiscal year 1996, executive agencies expect to obligate more than $26 billion for information technology (IT) investments and operations. This IT spending represents a critical investment of public tax dollars affecting virtually every government function. Creating a government that works better and costs less demands high returns on IT investments and reduced systems development risks.

This guide sets out an analytical framework for linking IT investment decisions to strategic objectives and business plans in Federal organizations; it supplements existing OMB policies and procedures. OMB's objective is to provide information on 1) what OMB expects from agencies and 2) how agencies can reduce the risk and maximize the net benefits from their IT investments. The guide was written with assistance from GAO and is based on strategic information management practices in successful organizations. (End Notes 1 and 2)

This guide describes the critical success elements and key phases that should be a part of a mature IT investment process. The IT investment process an agency designs should match the culture and organizational structure of the agency. The overriding objective is that senior managers be able to systematically maximize the benefits of IT investments through use of the IT investment process.

The investment process, depicted in Figure 1 below, consists of three phases that occur in a continuous cycle: selection, control, and evaluation. Information flows freely between the selection and control phases. The evaluation phase, by contrast, feeds information in one direction only, back to the selection phase, where it is used to verify or modify the criteria used during selection.

Select

-- create a portfolio of IT project investments that maximizes mission performance, using a standard set of criteria for consistent comparison of projects.

Control

-- measure ongoing IT projects against their projected costs, schedule, and benefits and take action to continue, modify, or cancel them.

Evaluate

-- 1) determine the actual return on investment of an implemented investment against the agency's mission and 2) adapt the existing process to reflect "lessons learned".

The control and evaluation phases are conducted throughout the year and their results are fed into the selection phase, which in turn feeds back to the control and evaluation phases.

This guide begins by identifying three attributes that characterize successful investment processes in best practice organizations. The guide then is organized by the phases of the investment process. Within each phase, the guide describes: 1) the steps involved, 2) applicable management techniques and tools, 3) key questions to ask, and 4) examples from best practice organizations.


[Figure 1: The IT investment process -- a continuous cycle of selection, control, and evaluation]


ORGANIZATIONAL ATTRIBUTES FOR SUCCESSFUL IT INVESTMENTS

While each phase of the investment process has its own requirements for successful implementation, there are some overall organizational attributes that are critical to successful investment evaluation. These shared, critical attributes are: senior management attention, overall mission focus, and a comprehensive portfolio approach to IT investment.

Critical Attribute #1: Senior management attention

Agency processes should include the following elements:

• Senior program managers, with authority to make key business and funding decisions on IT projects, are continuously involved in the process.

• A disciplined and structured management forum is used to make IT investment decisions, with the authority to approve, cancel, or delay projects, mitigate risks, and validate expected returns.

• Program, Information Resource Management (IRM), and financial managers have clearly defined roles, responsibilities, and accountability for the success of IT projects. Mechanisms to achieve this include establishing service agreements between providers (IRM/Chief Financial Officer (CFO)) and consumers (line management) of information technology, incorporating IRM/CFO issues and requirements into program plans, and routinely involving the IRM/CFO offices in operational decisions.

Critical Attribute #2: Overall mission focus

Agency processes should:

• link strategic planning to the agency's mission goals and customer needs as required by the Government Performance and Results Act (GPRA) of 1993 (Public Law 103-62). This includes developing long-term general goals, setting specific annual performance targets, and annually evaluating actual performance against these targets.

• develop mission-related IT measures that link the IRM strategic plan with the agency strategic plan.End Note 3 For example, mission goals should be translated into objective, results-oriented measures of performance, both quantitative and qualitative, which can form the basis for measuring the impact of information technology investments.

• determine whether the function to be supported by the investment should be performed in the private sector rather than by an agency of the Federal government.

• determine whether the agency proposing to perform the function is the most appropriate agency.

• examine the work processes involved to ensure they are efficient, effective, and will take full advantage of the proposed automation.

• use mission benefit, not project completion on time and within budget, as an important measure of success for any IT project.

• identify all major existing or planned information systems and define their relationship to one another and to the agency's mission.

Critical Attribute #3: Comprehensive approach to IT investment

Agencies should:

• define a portfolio that includes IT projects in every phase (initial concept, new, ongoing, or fully operational) End Note 4 and for every type (mission critical, cross-functional, infrastructure, administrative, and R&D) End Note 5 of IT system.

• develop levels of review, documentation requirements, and selection criteria appropriate to the phase and type of IT system.

• define dollar thresholds that can be used to channel projects to the appropriate agency decision levels to best accommodate organization-wide versus unit-specific impact. Most important is the use of a consistent set of investment decision practices throughout the agency. Some best practice organizations submit projects to thorough investment reviews when costs exceed between 0.5 and 2 percent of the organization's IT budget (a hypothetical routing sketch follows this list).

• develop criteria for identifying projects of a critical nature that fall below the dollar threshold but should be included in the investment review process.
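To illustrate, the threshold rule above can be reduced to a short routing check. The following Python sketch is purely illustrative: the 1 percent threshold, the review levels, and the function name are assumptions, not values prescribed by this guide.

    # Hypothetical sketch of a dollar-threshold routing rule. Best practice
    # organizations reported thresholds between 0.5 and 2 percent of the IT
    # budget; the 1 percent default and the review levels are assumptions.
    def review_level(project_cost, it_budget, threshold_pct=0.01, is_critical=False):
        """Route a project to an agency-wide or unit-level investment review."""
        if is_critical or project_cost >= threshold_pct * it_budget:
            return "agency-wide investment review"
        return "unit-level review"

    # Example: a $3 million project in an agency with a $200 million IT budget.
    print(review_level(3_000_000, 200_000_000))  # agency-wide investment review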

Each attribute contributes to properly implementing the three phases of the investment process. Senior managers and those helping to install the investment process in each agency should keep these elements in mind during review of the details of the selection, control, and evaluation phases.


PHASES OF THE INVESTMENT CONTROL PROCESS

PHASE ONE: SELECTION

The selection phase creates a portfolio of IT project investments designed to improve overall organizational performance. This phase combines rigorous technical evaluations of project proposals with executive management business knowledge, direction, and priorities. Key to this phase is the use of uniform, consistent decision criteria that will allow agency executives to make comparisons of costs, benefits, risks, and returns across project proposals. The four-step selection process is:

Step 1 -- screen IT project proposals;

Step 2 -- analyze risks, benefits, and costs;

Step 3 -- prioritize projects based on risk and return; and

Step 4 -- determine the right mix of projects and make the final cut.

Management Tools and Techniques Applicable to this Phase

• Executive management team that makes funding decisions based upon comparisons and tradeoffs among competing project proposals, especially for those projects expected to have organization-wide impact.

• Documented and defined decision criteria that examine expected return on investment (ROI), technical risks, improvement to program effectiveness, customer impact, project size and scope.

• Pre-defined thresholds and authority levels that recognize the need to channel project evaluations and decisions to appropriate management levels to accommodate unit-specific versus agency level needs.

• Minimum acceptable ROI hurdle rates for project approvals -- applicable to all organizational levels -- to minimize risks and increase returns (a hurdle-rate sketch follows this list).

• Risk assessments that expose potential technical and managerial weaknesses that could impair project success.
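As a minimal sketch of the hurdle-rate idea above, the check below compares each proposal's estimated ROI against a minimum acceptable rate. The 10 percent hurdle and the sample figures are illustrative assumptions, not OMB guidance.

    # Hypothetical hurdle-rate screen: proposals whose estimated ROI falls
    # below the minimum acceptable rate are flagged before further review.
    HURDLE_RATE = 0.10  # assumed value for illustration

    def passes_hurdle(estimated_roi, hurdle=HURDLE_RATE):
        return estimated_roi >= hurdle

    proposals = {"System A": 0.22, "System B": 0.06, "System C": 0.14}
    for name, roi in proposals.items():
        status = "passes" if passes_hurdle(roi) else "fails"
        print(f"{name}: estimated ROI {roi:.0%} {status} the hurdle")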

Step 1: Screen Project Proposals

IT proposals should be screened to determine the appropriate level of review, as well as for relevance and feasibility.

A mature investment screening process should prescribe the amount of documentation and the level of analytical rigor required, depending on the project's type (i.e., mission critical, infrastructure, etc.) and phase (i.e., initial concept, new, ongoing, and operational). For instance, when senior managers analyze initial concept proposals, the questions and documentation would be different from those required for a project that is ready to be awarded and implemented.

Example: One best practice company required more documentation and greater analytical rigor if a proposal would replace or change an operational system vital to keeping the company running or if the concept matched a company-wide strategic goal. Lower-impact proposals that would only affect an office or had a non-strategic objective were not scrutinized in as much detail.

If a project proposal does not meet all the essential requirements necessary for its type and phase, it should be returned to the originating business unit sponsor with an indication of the problems, issues, or documentation that need further work or clarification.

Following are some of the questions that can be used to screen projects for relevance to the agency's mission and for technical and organizational feasibility. If the answer to any of these questions is no, the project should not receive consideration and should be returned to the originating unit. Projects that meet these criteria should continue to Step 2, where more rigorous analysis is performed (a minimal screening sketch follows the list of questions).

Key Questions to Consider in Screening a Proposal:
  • Is the project clearly relevant to mission priorities outlined in the agency's strategic or business plan?

  • Is the project feasible to design and execute given the agency's demonstrated capability to deliver?

  • Are there commercial off-the-shelf (COTS) systems available to achieve the majority of the project's goals?

  • Has another agency done this type of a project before? If so, have lessons learned been incorporated into the project plan, and consideration given to using their system for the project's requirements?

  • Does the project conform to the agency's technology and information architecture?

  • Will the project be executed in well-defined stages, including decision points for continuing, modifying, or canceling the project?
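The screening gate implied by these questions can be sketched in a few lines. The question labels and data structure below are hypothetical paraphrases of the list above.

    # Hypothetical screening gate: a proposal advances to Step 2 only if
    # every screening question is answered "yes"; a single "no" returns
    # it to the originating unit, as described above.
    SCREENING_QUESTIONS = [
        "relevant to mission priorities",
        "feasible given demonstrated capability",
        "COTS options considered",
        "lessons from other agencies incorporated",
        "conforms to the technology and information architecture",
        "executed in well-defined stages",
    ]

    def screen(answers):
        """answers maps each question label to True ("yes") or False ("no")."""
        failed = [q for q in SCREENING_QUESTIONS if not answers.get(q, False)]
        if failed:
            return "Return to originating unit; unresolved: " + "; ".join(failed)
        return "Proceed to Step 2 for more rigorous analysis"

    print(screen({q: True for q in SCREENING_QUESTIONS}))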

Step 2: Analyze Project Risks, Benefits, and Costs

At this point, the proposals should be reduced to those with the highest potential to support the agency's critical mission and/or operations.

A detailed evaluation of each proposal's supporting analyses should be conducted and summarized so that senior management can begin examining the tradeoffs among competing proposals that occur in the next step. At this stage, a technical review team should evaluate the soundness of the project's benefit-cost and risk analyses. In particular, the review team should examine how the project is expected to improve program or operational performance and the performance measures that will be used to monitor expected versus actual results.

Example: One best practices organization required the project team to present not only the estimated return on investment (ROI), but also the specific assumptions underlying their analysis, why such assumptions are appropriate under these circumstances, and any differences from assumptions used to calculate ROI for comparable projects in the past.
Example: In another best practices organization, qualified staff reviewed and scored all projects using risk criteria before the projects were reviewed for approval by top managers. The top managers considered these risk scores in their decision making process. Risk elements were reported in five categories: security, user and customer impact, system (project) impact, dollar impact, and complexity. Within each category, applicable elements were given a numeric score from 1 (lowest risk) to 5 (highest risk). Under security, for example, elements included the classification levels of information to be processed, how programs and files were protected by security software, and what access controls were to be in place. Total scores from the individual elements in each category were weighted, based upon an agreed-upon formula, to reflect the organization's priorities. Weighted scores were included in the top managers' review packages.
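The scheme in this example reduces to a small calculation. In the Python sketch below, the five categories and the 1-to-5 element scale come from the example above, but the category weights and sample scores are invented for illustration.

    # Sketch of the weighted risk-scoring scheme described above: element
    # scores (1 = lowest risk, 5 = highest risk) are summed per category,
    # then weighted. The weights and sample scores are assumptions.
    CATEGORY_WEIGHTS = {
        "security": 0.30,
        "user and customer impact": 0.20,
        "system (project) impact": 0.20,
        "dollar impact": 0.20,
        "complexity": 0.10,
    }

    def weighted_risk_score(element_scores):
        """element_scores maps each category to a list of 1-5 element scores."""
        return sum(weight * sum(element_scores.get(category, []))
                   for category, weight in CATEGORY_WEIGHTS.items())

    sample = {
        "security": [4, 3, 5],  # e.g., classification, software protection, access controls
        "user and customer impact": [2, 3],
        "system (project) impact": [3],
        "dollar impact": [4],
        "complexity": [2, 2],
    }
    print(f"Weighted risk score: {weighted_risk_score(sample):.1f}")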
Key Risk Questions to Consider:
  • Has the relevant agency group successfully managed previous IT investments of similar risk and complexity?

  • Has the project team assessed project risk (e.g., unusual technical requirements or system complexity) using a comprehensive, well understood and documented process?

  • Has a sensitivity analysis been performed for key variables?

  • For higher risk projects, does the proposal explain how specific risk factors will be continuously monitored to minimize exposure?

  • What are the risks to program operations and customer service if this project does not proceed?

Key Benefit Questions to Consider:
  • Have the benefit estimates been validated or approved by users?

  • Has the project team prepared a benefit-cost analysis for the investment that:

    • 1) relies on systematic measures of mission performance,

    • 2) is consistent with OMB Circular A-94, "Guidelines and Discount Rates for Benefit-Cost Analysis of Federal Programs" (a minimal discounting sketch follows this list of questions), and

    • 3) is at a level of detail appropriate to its size?

  • What are the constraints and assumptions that may affect the costs and benefits of alternative solutions?

  • Does the justification for the investment proposal depend on projected benefits that occur more than 5 years in the future? If so, what is the level of confidence in those benefits estimates? End Note 6

  • Is an IT investment considered an infrastructure project that makes future projects possible? If so, how does the benefit-cost analysis account for expected payoffs from future investments?

  • Do the assumptions supporting the analysis accurately reflect market conditions where commercially available software and hardware costs are declining each year? Are agency cost assumptions based on today's prices or prices expected at the time of budget execution?

  • Are quantitative and qualitative benefits clearly expressed in mission or program improvement terms (e.g., changes in quality, cost, speed, accuracy, or productivity)?

  • Is it possible to share the costs of the project across different organizational units with similar needs?
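For the benefit-cost question above, the discounting at the heart of a Circular A-94 analysis can be sketched briefly. The 7 percent real discount rate is the Circular's base case, but agencies should consult the Circular for the rate applicable to their analysis; the cash flows below are invented.

    # Minimal net-present-value sketch in the spirit of OMB Circular A-94.
    # The 7 percent real discount rate is the Circular's base case; the
    # cash flows are illustrative assumptions.
    def npv(net_benefits, rate=0.07):
        """net_benefits[t] is benefits minus costs, in constant dollars,
        for year t, with t = 0 the initial year."""
        return sum(b / (1 + rate) ** t for t, b in enumerate(net_benefits))

    # Year 0: $5M development cost; years 1 through 5: $1.5M annual net benefit.
    flows = [-5.0] + [1.5] * 5
    print(f"NPV at 7 percent: ${npv(flows):.2f}M")  # a positive NPV favors the project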

Step 3: Prioritize Projects Based on Risk and Return

During this step, IT projects are rigorously compared against one another to create a prioritized list of all investments under consideration.

After completing this analysis, the agency should develop a ranked listing of information technology projects. This listing should use expected risks and benefits to identify candidate projects with the greatest chances of effectively and efficiently supporting key mission objectives within given budget constraints. End Note 7

One approach to devising a ranked listing of projects is to use a scoring mechanism that provides a range of values associated with project strengths and weaknesses for risk and return issues. Table 1, below, shows an example of how individual risk and return factors might be scored. This example is a hybrid table drawn from multiple best practices organizations. Higher scores are given to projects that meet or exceed positive aspects of the decision criteria. Additionally, in this example, weights have been attached to criteria to reflect their relative importance in the decision process. In order to ensure consistency, each of the decision criteria should have operational definitions based on quantitative or qualitative measures.


Table 1: Example of Decision Criteria and Scoring Process Used to Rank IT Projects

Each criterion is scored from 1 to 10 for each IT project (1 through n), and each score is multiplied by the criterion's weight. Weights within each group of factors sum to 100 percent.

Overall Risk Factors (weights sum to 100%)

• Investment Size (weight: 40) -- How large is the proposed technology investment, especially in comparison to the overall IT budget?
  Scoring: 1 (Large) to 10 (Small)

• Project Longevity (weight: 30) -- Do projects adopt a modular approach that combines controlled systems development with rapid prototyping techniques? Are projects as narrow in scope and brief in duration as possible to reduce risk by identifying problems early and focusing on projected versus realized results?
  Scoring: 1 (Non-modular) to 10 (Modular)

• Technical Risk (weight: 30) -- How will the proposed technology be integrated into existing systems? Will the proposed investment take advantage of Commercial Off-The-Shelf (COTS) software and systems? How will the complexity of the systems architecture and software design affect the development of the project?
  Scoring: 1 (Experimental/Custom) to 10 (Established/Industry Standard)

Overall Return Factors (weights sum to 100%)

• Business Impact or Mission Effectiveness (weight: 25) -- How will the technology investment contribute toward improvement in organizational performance in specific outcome-oriented terms?
  Scoring: 1 (Low) to 10 (High)

• Customer Needs (weight: 15) -- How well does the technology investment address identified internal and/or external customer needs and demands for increased service quality and timeliness or reductions in costs?
  Scoring: 1 (Low) to 10 (High)

• Return on Investment (weight: 20) -- Are the return on investment figures, using benefit-cost analysis thresholds, reliable and technically sound?
  Scoring: 1 (Risky estimates) to 10 (Known benefit)

• Organizational Impact (weight: 25) -- How broadly will the technology investment affect the organization (i.e., the number of offices, users, work processes, and other systems)?
  Scoring: 1 (Low) to 10 (High)

• Expected Improvement (weight: 15) -- Is the proposed investment being used to support, maintain, or enhance existing operational systems and processes (tactical) or designed to improve future capability (strategic)? Is the project required by law, court ruling, Presidential directive, etc.? Is the project required to maintain critical operations (payroll, beneficiary checks, human safety, etc.) at a minimal operating level? What is the expected magnitude of the performance improvement from the technology investment?
  Scoring: 1 (Tactical: improves existing process) to 10 (Strategic: provides new capability)

Total Risk-Adjusted Score = Weighted Sum of Overall Risk Factors + Weighted Sum of Overall Return Factors
A scoring and ranking process such as the one depicted in Table 1 may be used more than once and in more than just this step to "winnow" the number of projects that will be considered by an executive decision-making body down to the best possible choice.
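The calculation behind Table 1 is mechanical and can be sketched briefly. The criteria names and weights below come from Table 1; the sample project's scores are invented for illustration.

    # Sketch of the Table 1 calculation: each criterion is scored 1-10,
    # multiplied by its weight, and the weighted risk and return sums
    # are added to give the total risk-adjusted score.
    RISK_WEIGHTS = {"investment size": 0.40, "project longevity": 0.30, "technical risk": 0.30}
    RETURN_WEIGHTS = {
        "business impact": 0.25,
        "customer needs": 0.15,
        "return on investment": 0.20,
        "organizational impact": 0.25,
        "expected improvement": 0.15,
    }

    def risk_adjusted_score(scores):
        """scores maps each criterion name to a 1-10 score (samples assumed)."""
        risk = sum(w * scores[c] for c, w in RISK_WEIGHTS.items())
        ret = sum(w * scores[c] for c, w in RETURN_WEIGHTS.items())
        return risk + ret

    project = {
        "investment size": 7, "project longevity": 5, "technical risk": 6,
        "business impact": 8, "customer needs": 6, "return on investment": 7,
        "organizational impact": 5, "expected improvement": 4,
    }
    print(f"Total risk-adjusted score: {risk_adjusted_score(project):.2f}")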

An outcome of such a ranking process might produce three groups of projects (a classification sketch follows the list):

  • Likely winners -- One group, typically small, is a set of projects with high returns and low risk that are likely "winners."

  • Likely drop-outs -- At the opposite end of the spectrum, a group of high risk, low return projects usually develops that would have little chance of making the final cut.

  • Projects that warrant a closer look -- In the middle is usually the largest group. These projects have either a high return/high risk or a low return/low risk profile. Analytical and decision-making energy should be focused on prioritizing these projects in the middle group, where decisions will be more difficult to make.
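As a purely illustrative sketch of this grouping, the classifier below uses Table 1's convention that higher weighted scores are more favorable (higher return, lower risk); the single cutoff value is an assumption an agency would set for itself.

    # Hypothetical three-way triage of ranked projects. Scores follow
    # Table 1's convention: higher is more favorable (for the risk
    # factors, a higher score means lower risk).
    CUTOFF = 6.0  # assumed dividing line on the weighted 1-10 scales

    def triage(return_score, risk_score):
        favorable_return = return_score >= CUTOFF
        favorable_risk = risk_score >= CUTOFF
        if favorable_return and favorable_risk:
            return "likely winner (high return, low risk)"
        if not favorable_return and not favorable_risk:
            return "likely drop-out (low return, high risk)"
        return "warrants a closer look"

    for name, ret, risk in [("A", 8.2, 7.5), ("B", 3.1, 2.0), ("C", 7.5, 3.0)]:
        print(f"Project {name}: {triage(ret, risk)}")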

At the end of this step, senior managers should have a prioritized list of IT projects and proposals with supporting documentation and analysis.

Step 4: Determine the Right Mix of Projects and Make the Final Cut

During this step, an executive level decision making body determines which projects will be funded based on the analyses completed in the previous steps.

Determining the right mix of projects to fund is ultimately a management decision that considers the technical soundness of projects, their contribution to mission needs, performance improvement priorities, and overall funding levels that will be allocated to information technology.

Senior management should consider the following balancing factors when arriving at a final resource allocation and project mix.

  • Strategic improvements vs. maintenance of current operations

    Efforts to modernize programs and improve their mission performance may require significant investments in new information systems. Agencies also have operational systems on which the agencies depend to operate their programs as currently structured. These older systems may need to be maintained. A balance should be struck between continuing to invest in older systems and modernizing or replacing them. It may be helpful to track over time the percentage of funding spent on strategic/development vs. maintenance/operations projects.

  • New projects vs. ongoing projects

    The senior managers who choose the final mix of projects to be funded must periodically re-examine projects that have already been approved to ensure that they should still be supported. There may be concerns about a project's implementation, such as greater-than-expected delays, cost overruns, or failures to provide promised benefits. If new projects are more consistent with an agency's strategic initiatives, offer greater benefits for equivalent cost, or present fewer risks, the old projects may need to be canceled.

  • High vs. low risk

    If a portfolio is managed only to minimize risk, senior management may unnecessarily constrain an agency's ability to achieve results. High risk, high return projects can significantly enhance the value to the public of an agency's IT spending, provided the agency has the capability and carefully manages the risks. Most organizations, however, can only handle a limited number of such projects. As a result, senior management must consciously help balance the amount of risk in the portfolio against the agency's capabilities and ability to manage risk.

  • Impact of one project on another

    As federal agencies increasingly integrate their systems, every new project proposal is likely to affect, or be affected by, other project proposals, ongoing projects, or current systems. Senior management must recognize the context in which the new project will be placed and make decisions accordingly. For example, one best practice company tracks the number of dependencies between a new project and other projects and systems as a risk factor.

  • Other complicating factors

    Other complicating factors can heavily influence how senior management makes a final cut for approved IT projects.

  • Opportunity costs

    Consider the impact on long range investment opportunities if all of the current projects are funded. Will large current costs preclude or delay better future opportunities? Will large current capital expenditures create even larger maintenance costs in the future?

  • External funding

    IT projects sometimes rely on funding and resources from outside agencies or private organizations. If any project under consideration requires critical components from outside the agency, then the value of the agency's investment may be lost if the commitment by the outside party later shifts.

  • Budget constraints

    How much does the agency have available for IT investments for this budget year and for the next several years? Besides budget year spending levels and out-year estimates for the agency, the analysis should examine if there are other sources of funding for the projects. The agency should identify these other sources in its investment proposals.

    What projects will fit under the spending levels this budget year and in out-years? Senior management can take the final list of projects with their associated costs and determine which projects fit within the spending parameters this budget year and/or in out-years. A project may have a relatively high priority, but resource constraints may preclude funding it this budget year. Senior management can then decide that the project be approved, but that its start date be delayed until funds are available, assuming it still matches the agency priority needs in the coming years.

After consideration of all of the factors mentioned above, senior management should have enough information to make knowledgeable investment decisions. Senior management should also designate how many times a project is to be reviewed, based on its level of risk, and any steps that the project team must take to mitigate that risk. For example, one best practices organization requires that senior management approve projects only after a review schedule has been established (e.g., review once a month for high risk projects, or once a quarter for lower risk projects) and specific requirements have been given to the project team to ensure that they mitigate risks (e.g., develop a risk management plan).

Project review schedules, risk mitigation plans and the cost-benefit plans from prior steps all feed directly into the next section of the investment process -- control.

PHASE TWO: CONTROL

While agencies select proposals once a year, the control phase is an ongoing activity to review new and ongoing projects, as well as operational systems. During the control phase, senior management regularly monitors the progress of ongoing IT projects against projected cost, schedule, performance, and delivered benefits. The frequency of the reviews may vary, but they should not wait until the annual budget preparation and deliberation process. How often and to what extent individual projects should be reviewed should have been established as the last step in the Selection phase. Rather than letting problems and concerns from unexpected risks go unaddressed, this phase reinforces management accountability by creating pre-arranged checkpoints for projects and forcing corrective action when necessary. If a project is late, over budget, or not being developed according to expectations, then senior management must decide whether to continue, modify, or cancel it. The steps in this phase are to:

Step 1 -- monitor projects/systems against projected costs, schedule, and performance; and

Step 2 -- take action to correct any deficiencies.

Management Tools and Techniques Applicable to this Phase

  • Establish processes to involve senior management in ongoing reviews and force decisive action steps to solve problems early in the project.

  • Define explicit measures and data for monitoring expected versus actual project outcomes on cost, schedule, and performance; these should be consistently maintained throughout the organization and readily accessible via automated management information systems.

  • Create positive incentives for raising real and potential project problems for management attention and action.

Before an organization can fully implement the control steps, uniform mechanisms for collecting, automating, and processing data on expected versus actual costs, schedules, and returns should be in place for all projects. End Note 8

Example: One best practice company has developed a database which stores risk-based assessment data about ongoing IT projects. The company uses RED, YELLOW, and GREEN symbols to evaluate each project on several dimensions, including quality of deliverables, conformance with company project development processes, and technical feasibility. For example, a YELLOW symbol on the deliverables dimension would mean that the company is concerned the expected deliverable will not meet needs and that minor improvement is required. An overall assessment symbol is applied to each project as well: a project receives an overall RED symbol when at least one RED symbol, or three YELLOW symbols, is attached to it. The database provides executive management with an easily accessible tool for identifying risks by type or severity.
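The roll-up rule in this example can be expressed directly. In the sketch below, the RED rule comes from the example above; treating any remaining YELLOW as an overall YELLOW is an added assumption, since the example specifies only the RED rule.

    # Sketch of the status roll-up described above: overall RED if any
    # dimension is RED or at least three are YELLOW. The YELLOW/GREEN
    # tiers are assumptions; the source specifies only the RED rule.
    def overall_status(dimension_statuses):
        reds = dimension_statuses.count("RED")
        yellows = dimension_statuses.count("YELLOW")
        if reds >= 1 or yellows >= 3:
            return "RED"
        return "YELLOW" if yellows else "GREEN"

    # Dimensions might include quality of deliverables, conformance with
    # development processes, and technical feasibility.
    print(overall_status(["GREEN", "YELLOW", "YELLOW", "GREEN"]))   # YELLOW
    print(overall_status(["YELLOW", "YELLOW", "YELLOW", "GREEN"]))  # RED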

Step 1: Monitoring Projects/Systems Against Projected Costs, Schedule, and Performance

Senior managers need to compare the preliminary results being achieved by a project against its projected costs, benefits and risks, and to identify actual or potential managerial, organizational, or technical problems.

Senior management should be able to judge whether a project is on track to achieve its projected mission benefits. The key is to use a set of performance measures consistently so that senior program managers are provided early warning of potential or actual problems. It is essential to refresh these measures as costs, benefits, and risks become better known to ensure the continued viability of an information system prior to and during implementation.

Examples of problems that could affect a project or system include 1) lack of input by program management into the requirements phase of a project, 2) a project intended to be cross-functional becoming stove-piped because other offices in the agency do not support it, 3) the addition of new requirements, and 4) technology that is more difficult to use than anticipated.

Senior program managers in federal agencies often pay most of their attention to new projects and carry ongoing projects as necessary budget items. In best practice organizations, however, ongoing projects are reviewed continually along with new projects, and go/no-go decisions are made. No failing project should be allowed to continue indefinitely; project continuance should be periodically challenged.

Based on a schedule developed during the selection phase, each project/system should be reviewed with at least the following considerations in mind:

  • How do current costs compare against projected costs?

  • How does the current schedule compare against the projected schedule?

  • How does the current performance of the deliverables compare against projected measures?

  • If we were starting over, would we fund this proposal today?

  • Have new requirements "crept" into the project?

  • Have business conditions changed significantly since the project was approved?

  • Is the project still technically feasible?

  • Is the project dependent on other projects? Are they late?

  • Does the project still support the architecture?

  • Is the project necessary for the successful completion of other projects?

Senior program management should be able to develop a well-informed picture of current and potential problems for each ongoing IT project.
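A minimal monitoring sketch, drawing on the considerations above, might compare current figures against projections and flag variances for management attention. The data structure, the 10 percent cost tolerance, and the sample figures are illustrative assumptions.

    # Hypothetical expected-versus-actual check for an ongoing project.
    COST_TOLERANCE = 0.10  # assumed: flag projects more than 10% over projected cost

    def monitor(project):
        """project carries projected/actual cost (dollars) and schedule (months)."""
        flags = []
        cost_var = (project["actual_cost"] - project["projected_cost"]) / project["projected_cost"]
        if cost_var > COST_TOLERANCE:
            flags.append(f"cost overrun of {cost_var:.0%}")
        slip = project["actual_months"] - project["projected_months"]
        if slip > 0:
            flags.append(f"schedule slip of {slip} months")
        return flags or ["on track"]

    print(monitor({"projected_cost": 4_000_000, "actual_cost": 4_800_000,
                   "projected_months": 18, "actual_months": 21}))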

Step 2: Taking Action to Correct Deficiencies

The action should result in the deliberate continuation, modification, or cancellation of each project.

The prior step, monitoring of projects, should pinpoint the projects on which senior management needs to make decisions. What action to take is a management decision.

Senior management should ensure that:

  • Solutions to problems are not the sole province of the IRM organization. Even when senior management is aware of problems with projects or systems, the solution is too often left with the information systems organization. Senior managers should ensure that program officials are involved in the solution, since in many instances it may be the business side of the organization which provides a solution.

  • All management decisions are documented along with data supporting the required changes. Common problems and their solutions, which are applicable to one IT project, should be evaluated as to how they apply to other IT projects under management's purview. To avoid replication of effort for analysis, documentation of management decisions is critical. Federal agencies often treat each budget year as isolated and provide funding for whatever can be supported each year rather than evaluating the IT projects with a historical perspective. By contrast, leading organizations revise their selection processes and IT funding decisions based upon the outcomes produced from the previous year.

For example, many federal agencies are prototyping IT projects before moving into the implementation stage. Monitoring the mission results gained by the prototype allows senior program management to make informed decisions about whether to stop or modify a project at this stage, rather than letting the project continue on into implementation automatically.

Proper control of IT investments enables senior management to mitigate the risks of schedule slippage, cost overruns, and development of a product that does not meet the goals originally intended. This process is highly dependent on facts provided through continual measurement of new and ongoing projects. The data fed from the Selection process to the Control process supports this requirement, as do the measurements taken throughout the life of a project.

PHASE THREE: EVALUATION

Evaluation is conducted after a system has been implemented and is an assessment of the project's success or failure. Using post implementation reviews, data is collected, recorded, and analyzed to compare expected results against actual benefits and returns. Figure 1, shown previously, depicts the evaluation phase in relation to the other two phases. Evaluation is used to 1) determine whether changes are necessary to address serious performance gaps, and 2) make decisions about modifications to the organization's existing evaluation process and selection criteria. This phase comprises three steps:

Step 1 -- Conduct Post Implementation Reviews

Step 2 -- Decide on Adjustments

Step 3 -- Lessons Learned

Management Tools and Techniques Applicable to this Phase

  • Post implementation reviews to determine actual project cost, benefits, risks, and returns.

  • Maintaining accountability for project performance and success based on quantifiable measures to create incentives for strong project management and senior management ownership.

  • Modification of selection decision criteria and investment control processes as needed to ensure continual improvement based on lessons learned.

Step 1: Conduct Post Implementation Reviews

Conduct post implementation reviews and examine their results, focusing on anticipated versus actual outcomes in terms of cost, schedule, performance, and mission improvement. Determine the causes of major differences between plans and end results.

Most federal agencies accept that recently implemented systems are a fait accompli and move on from there. This point of view is contrary to the investment management philosophy of managing the entire IT portfolio. The primary tool to assess a project in best practice organizations is the post-implementation review. Questions to ask include:

  • How effective was the project in meeting the original objectives?

  • How well did the project meet the planned implementation dates?

  • What mission benefits has the project achieved, and do they match the benefits projected? If not, why not?

  • Were the original business assumptions that justified the system valid?

  • What lessons did the team learn from this project?

The post-implementation review should inform senior management's decision whether to continue, modify, or cancel operational systems.
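A post-implementation comparison of projected versus actual benefits can be sketched simply. The realization-rate measure and the sample figures below are illustrative assumptions, not a prescribed metric.

    # Hypothetical benefits-realization check for a post implementation review.
    def benefits_realization(projected, actual):
        """Fraction of projected mission benefits actually realized."""
        return actual / projected

    projected, actual = 10.0, 7.2  # assumed $M of annual mission benefit
    rate = benefits_realization(projected, actual)
    print(f"Realized {rate:.0%} of projected benefits; "
          f"the review should explain the ${projected - actual:.1f}M gap")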

Step 2: Decide on Adjustments

Using the results of the post implementation review as a baseline, decide whether to continue without adjustment, to modify the system to improve performance or, if necessary, to consider alternatives to the implemented system.

Even with the best system development process, it is quite possible that a new system will have problems or even major flaws that must be taken care of in order for the agency to get the full benefit of its investment. The post implementation review should provide executive management with useful information on how best to modify a system, or to work around the flaws in a system, in order to improve performance and to bring the system further into alignment with the needs of its customers.

Step 3: Lessons Learned

Using the collective results of post implementation reviews across completed systems, modify the organization's existing investment selection and control processes based on lessons learned.

The information from post implementation reviews helps senior management develop better decision criteria during the Selection process and improve the evaluation of ongoing projects during the Control process.

Example: After post implementation reviews of several completed projects, one best practice organization found that it was realizing only a 9 percent return on the projected benefits of its information systems investments. This focused senior management attention on more rigorous and realistic assessments of the benefits projections presented during the selection cycle of its investment decision making process. Cost and benefit estimation techniques were improved, based upon quantitative data associated with past systems development efforts. Low value and high risk projects became more readily identifiable in the investment selection and control processes. Within two years, this company saw IT benefits exceed initial projections by some 33 percent.


CONCLUSION

A mature investment process will help ensure that taxpayer dollars spent on information technology are used to effectively support the agency's mission objectives. Dwindling resources and higher public demand for service mean that a project must be worth doing from a mission perspective, achievable at reasonable time and cost, and supportive of the strategic direction of the agency.

A mature investment process requires discipline, executive management involvement, accountability, and focus on risks and returns using quantifiable measures. Senior program managers, those with the programmatic responsibility in key business areas, should be involved directly in prioritizing and selecting the IT projects their organization will pursue. Their decisions should be well-informed, based on analytical rigor and robust measures. Furthermore, a mature investment process is a year-round activity, not just a process to be done near budget time. Senior program managers should be involved in devising and enforcing solutions to the problems that inevitably arise. Finally, the mature investment process is a learning process. The real-world results of IT projects and mission programs should be continuously fed back to senior managers as they make decisions on new projects and operational systems.


End Notes

  1. The foundation for the IT investment process and most of the examples cited in this guide are based upon case study research of leading private and public sector organizations conducted by the United States General Accounting Office (GAO) and summarized in its report entitled, Executive Guide: Improving Mission Performance through Strategic Information Management and Technology (GAO/AIMD-94-115, May 1994).

  2. For more information on IT investment, call David McClure or Alicia Wright at GAO (202-512-6406), or Jasmeet Seehra at OMB (202-395-3785).

  3. OMB Circular No. A-11, Transmittal Memorandum No. 66, Supplement No. 1 (September 14, 1995), "Preparation and Submission of Strategic Plans," provides detailed guidance on the preparation of agency strategic plans. In particular, Section 200.10, "Description of how the general goals and objectives will be achieved," calls for a description of how the agency will use "technolog[y]...information, and other resources" to support the agency's mission goals and objectives, thus establishing the basis for a linkage between the IT investment process and the agency's strategic plan.

  4. Initial concepts - A project idea that has a sponsor and mission relevance but has not had formal cost/benefit, alternative, or requirements analyses; New - A project that has had formal cost/benefit, alternative, or requirements analysis completed, but has not been awarded; Ongoing - A project that has been awarded but has not been completely implemented (this also includes pilots and prototypes); and Operational - Systems that are completely implemented (this includes legacy systems).

  5. Mission critical - Systems essential to the completion of an agency's mission (sometimes known as program specific); Cross-functional - Systems that cut across more than one program or mission in an agency; Infrastructure - Enabling technologies that are essential to run other types of systems (may include networks, telecommunications, etc.); Administrative - Systems that are operational in nature, but not mission specific (such as accounting and financial systems); R&D - Projects or systems that are "state of the art" either in concept or technology that could reinvent the agency's processes or mission. Embedded systems are not included because they are considered to be parts of larger systems, such as an aircraft or rapid transit system.

  6. Expected benefits accruing from multi-year information technology investments are difficult to quantify precisely and are often subject to change; the reference to the five-year time frame is offered as a guide.

  7. Mission and management objectives may be prioritized in an agency strategic plan. Information Systems Plans (ISPs) may also be in place that are clearly linked to the agency's strategic plan and have taken into consideration that portion of the budget available for IT spending. In such cases, some level of IT prioritization may already be completed. However, this step should still be used to confirm these linkages and evaluate risks and returns across projects throughout the entire organization.

  8. The Federal Acquisition Streamlining Act of 1994 (P.L. 103-355), Subtitle B, Title V, requires agencies to establish and track major acquisitions against cost, schedule, and performance goals. OMB Bulletin No. 95-03, "Planning and Budgeting for the Acquisition of Fixed Assets" (June 27, 1995), prescribes reporting requirements consistent with that Act. These measures can be designed for use in the Selection and Control phases of the investment management process.

