
An Outsourcing Internal Control Methodology for Information Systems

Authors: L. Jane Park, Ph.D., CPA, Professor of Accounting,
and Paul H. Rosenthal, Ph.D., Professor of Information Systems
Contributed by California State University, Los Angeles

 

 

Introduction

Industry and government have a long tradition of purchasing and subcontracting for products and services.  This type of purchasing and subcontracting is currently called sourcing or outsourcing.  The Health Insurance Portability and Accountability Act of 1996 (HIPAA) regulations in the health care industry and Section 404 of the Sarbanes-Oxley Act require management assessment and audit of every public company's internal controls as an integrated part of its financial audit [AICPA and HIPAA].  Following from these regulations, the AICPA's Professional Ethics Executive Committee is exploring issues surrounding outsourcing to third-party providers.  Outsourcing control methodologies are therefore becoming an essential element of organizations' required internal controls.

This paper presents a proven outsourcing internal control methodology that has been used for several decades in the information technology arena, since the primary functions of a modern information systems organization, except for strategic planning, can be either performed in-house or outsourced to development, processing, networking, or consulting providers/vendors.  The evaluation of these providers/vendors is usually based on some type of cost-value analysis to rank and select providers.  A basic method for such cost-value analysis is the computation of a worth index.  Since almost all outsourcing proposals are required to provide a technical and managerial proposal and a separate cost proposal, the worth index is computed as:

                Worth Index = (α*Technical Score + β*Managerial Score) / Life Cycle Cost

This paper includes a proven methodology for computing the technical score, managerial score, and life cycle costs for a worth index using both RFP and RFQ approaches.  The worth index methodology presented in this paper is applicable to functional sourcing opportunities in six IS/IT areas: the full IS organization (excluding strategic planning), IS development projects, IS data center production, IS technical support, telecommunications, and architecture planning support.  These functional sourcing opportunity areas exist at both the enterprise and department/workgroup levels.
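
For illustration, a minimal sketch of the worth index computation follows in Python; the weights, scores, and cost below are hypothetical values, not figures from the illustration later in this paper.

    def worth_index(technical_score, managerial_score, life_cycle_cost,
                    alpha=0.5, beta=0.5):
        """Worth Index = (alpha*Technical Score + beta*Managerial Score) / Life Cycle Cost."""
        return (alpha * technical_score + beta * managerial_score) / life_cycle_cost

    # Hypothetical example: 0-100 proposal scores, life cycle cost in $000.
    print(round(worth_index(77, 80, 1500), 3))   # -> 0.052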

Quantitative Evaluation Methodology

A fabricated comparison, based on several actual selection projects, between an in-house alternative and three external potential vendors of an applications software package will be used to illustrate this paper's proposed quantitative, worth index based process.  The quantitative evaluation process is diagrammed in the following model.

Worth Index Computation Process

[Chart not reproduced.]


This paper first discusses an overall methodology for sourcing evaluation based on popular RFP/RFQ approaches; second, presents the qualitative method used for the numerator; third, presents a quantitative approach for the denominator; and fourth, presents some useful approaches for presenting the results of the analysis.
 

RFP/RFQ Methodologies

 

The process shown in the prior chart and described in the following sections of this paper is RFP based.  Equal weight is placed on quality and cost.  Most advanced systems and products use this RFP methodology.

 

The RFQ approach shown in the following typical chart is used for projects with well-known methodologies, such as construction and sometimes consulting or shrink-wrapped package selection.  The RFQ method uses a pass/fail approach to the evaluation of technical and managerial factors.  The sealed cost proposals are then opened only for the highly qualified bidders, and the low bidder is the winner.  Note the Minimum Score column in the European Union procurement evaluation form that follows (The European Commission 2001).  That column is not used in the detailed IS example in this paper, since it is primarily RFQ oriented.

A Typical Management and Technical Proposal Evaluation Form

(EU "User-Friendly Information Society" Project)

                                                                         Vendor's Score*
    Evaluation Criteria                               Weight  Minimum  In-house  Startup  Large  Specialized
                                                               Score*
    Scientific/technological excellence, Innovation      4       3
    Community added value and contribution               2       2
    Contribution to Community Social Objectives          1       -
    Economic Development and S&T prospects               3       3
    Resources, Partnership and Management                2       2
    Weighted Score**

    ** SUM(Weight X Score)

Each vendor is evaluated and scored.  The scores are multiplied by the weights and summed to give a weighted total score for those vendors that exceed all minimums.  The cost proposals are then opened and the lowest bidder is awarded the contract (Halifax 2000).  Note: this method does not work for high-technology projects and one-of-a-kind projects or products.
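
The pass/fail-then-low-bid logic is simple enough to sketch in Python.  The criteria mirror the EU form above; the vendors, 0-5 ratings, and sealed bids are hypothetical, not from an actual procurement.

    # Hypothetical RFQ screening: (criterion, weight, minimum score); ratings are 0-5.
    CRITERIA = [
        ("Excellence",        4, 3),
        ("Added value",       2, 2),
        ("Social objectives", 1, 0),   # no minimum specified for this row
        ("Economic/S&T",      3, 3),
        ("Management",        2, 2),
    ]

    def qualifies(scores):
        """A bidder passes only if every criterion meets its minimum."""
        return all(scores[name] >= minimum for name, _weight, minimum in CRITERIA)

    def weighted_total(scores):
        """SUM(Weight X Score), as in the form's footnote."""
        return sum(weight * scores[name] for name, weight, _min in CRITERIA)

    def award(scores_by_vendor, sealed_bids):
        """Open cost proposals only for qualified bidders; the low bid wins."""
        qualified = [v for v, s in scores_by_vendor.items() if qualifies(s)]
        return min(qualified, key=lambda v: sealed_bids[v])

    ratings = {
        "Vendor-1": {"Excellence": 4, "Added value": 3, "Social objectives": 2,
                     "Economic/S&T": 3, "Management": 2},
        "Vendor-2": {"Excellence": 2, "Added value": 3, "Social objectives": 4,
                     "Economic/S&T": 4, "Management": 3},   # fails the Excellence minimum
    }
    bids = {"Vendor-1": 420, "Vendor-2": 395}
    print(award(ratings, bids))   # Vendor-1 wins; Vendor-2's lower bid is never opened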

 

The RFP oriented methodology that follows in this paper therefore applies to most information systems projects.  The following chart outlines the procedure for computing the worth index.  [Chart not reproduced.]  A management proposal evaluation team and a technical proposal evaluation team are formed at the time of RFP issuance.  They follow the evaluation methodology in the next sections of this paper.  The weighting of each proposal (α and β) is normally specified by executive management.

 

Deriving Technical and Managerial Scores and Life Cycle Costs 

A four-step process for deriving the worth index follows.

Step 1: Define Value of Specific Evaluation Criteria 

The specific criteria used in this illustration include:

  • Functionality: package capability related to functional requirements as a percentage of a perfect match.

  • Platform Utilization: the forecasted utilization of a current processing platform as a percentage of maximum feasible capacity.

  • Survival Probability: the forecasted probability, shown as a percentage, that the vendor package will maintain or expand its share of market over the planning horizon of this application.

  • Initial Cost: front-end cost in $ of software, support, training, conversion, and taxes.

  • Annual Cost: continuing costs in $ of maintenance and support.

  • Annual Benefits: estimated cost reductions or profit increases in $ due to converting to the new system.

More details on scoring these criteria can be found in the following section, "Sourcing Evaluation Criteria".

A typical result of the application of this step is shown in the following table.

 

                              Multi-Product  Specialized   Start-up    In-house
                                Vendor-A      Vendor-B     Vendor-C    Development
 Qualitative Criteria
   Functionality                   70%           90%         100%         100%
   Platform Utilization            30%           40%          40%          40%
   Survival Probability            90%           80%          30%         100%

 Quantitative Criteria
   Initial Cost (000)             $300          $400         $400         $800
   Annual Cost (000)              $100          $100         $100         $150
   Annual Benefits (000)          $200          $250         $280         $280
 

Step 2:  Compute Life Cycle Costs and ROI

Computing a return on investment (ROI) requires, in addition to initial and continuing costs, an estimated life of the project[1].  Currently many investments in applications software involve a planning horizon that is twice the platform's technology cycle, while most investments in platform alternatives involve a single technology cycle planning horizon.

Therefore, assuming a ten-year planning horizon (twice the mainframe's five-year technology cycle) with no adjustment for inflation, an ROI computation using the internal-rate-of-return methodology follows.

 

COMPUTATION USING FINANCIAL CALCULATOR

                                   Vendor-A   Vendor-B   Vendor-C   In-house
                                                                    Development
 1) Enter Trade-In Value (FV)           0          0          0          0
 2) Enter Product Life (n)             10         10         10         10
 3) Enter Initial Cost (PV)          -300       -400       -400       -800
 4) Enter Annual Savings (PMT)    200-100    250-100    280-100    280-150
 5) Compute IRR (COMP)(i)             31%        36%        44%        10%
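
The calculator keystrokes above can also be reproduced in a few lines of code.  A sketch using bisection on the net present value function, assuming level annual savings and zero trade-in as in the table:

    def irr(initial_cost, annual_saving, years, lo=0.0, hi=2.0):
        """Find the rate at which the NPV of the savings annuity equals the initial cost."""
        def npv(rate):
            return sum(annual_saving / (1 + rate) ** t
                       for t in range(1, years + 1)) - initial_cost
        for _ in range(100):                      # bisection; npv falls as the rate rises
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)
        return lo

    for name, cost, saving in [("Vendor-A", 300, 100), ("Vendor-B", 400, 150),
                               ("Vendor-C", 400, 180), ("In-house", 800, 130)]:
        print(f"{name}: {irr(cost, saving, 10):.0%}")   # 31%, 36%, 44%, 10%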

 

 

Step 3:  Compute Qualitative Criteria Index

Combining the three illustrated technical criteria requires that their relative importance be determined.  This type of importance ranking methodology (called the Delphi Method when first presented by the RAND Corporation during the 1950s) uses experts' rankings, which are then normalized into a weighting scale running from 0 to 1.  Applying this approach to the illustration results in the following table:

 

 

                        Weight   Vendor-A       Vendor-B       Vendor-C       In-House
                                 Value  Wt'd    Value  Wt'd    Value  Wt'd    Value  Wt'd
                                        Value          Value          Value          Value
 Functionality            .5       .7    .35      .9    .45     1.0    .50     1.0    .50
 Platform Utilization     .2       .3    .06      .4    .08      .4    .08      .4    .08
 Survival Probability     .3       .9    .27      .8    .24      .3    .09     1.0    .30
 Weighted Total                          .68            .77            .67            .88
 As a % of Perfect                        68             77             67             88

The weighted value columns are the product of the weights assigned by the experts times the evaluation criteria scores contained in the table from Step 1.
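
A short sketch of this Step 3 arithmetic, with all weights and values taken from the tables above:

    WEIGHTS = {"Functionality": .5, "Platform Utilization": .2, "Survival Probability": .3}
    VALUES = {
        "Vendor-A": {"Functionality": .7,  "Platform Utilization": .3, "Survival Probability": .9},
        "Vendor-B": {"Functionality": .9,  "Platform Utilization": .4, "Survival Probability": .8},
        "Vendor-C": {"Functionality": 1.0, "Platform Utilization": .4, "Survival Probability": .3},
        "In-House": {"Functionality": 1.0, "Platform Utilization": .4, "Survival Probability": 1.0},
    }
    for alternative, values in VALUES.items():
        total = sum(WEIGHTS[c] * values[c] for c in WEIGHTS)      # weighted total
        print(f"{alternative}: {total:.2f} ({total:.0%} of perfect)")   # .68 .77 .67 .88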

Step 4:  Compute Worth Index

The computation of a quantitative worth index for the illustrative evaluation is now straightforward.

 

WORTH INDEX CALCULATION

                                 Multi-Product  Specialized   Start-up    In-house
                                   Vendor-A      Vendor-B     Vendor-C    Development
 Technical Score (from Step 3)         68            77           67          88
 ROI (from Step 2)                    .31           .36          .44         .10
 Worth Index
   (Technical Score X ROI)             21            28           29           9

 

Based on the worth index, Vendors B and C are approximately equal from an objective (quantitative) viewpoint.  The decision between them would be based on subjective criteria such as competitive issues and control.

The worth index can be computed in three forms: using the ROI as shown in the illustration, using net present value (NPV), or using life cycle costs.  The formulas for each follow.

  • Using ROI

                                WORTH = SCORE X ROI

  • Using NPV

                                WORTH = SCORE X NPV

  • Using Life Cycle Costs

                                WORTH = SCORE / COST
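
A sketch of the three forms applied to Vendor-B's figures; the 6% discount rate for the NPV form and the undiscounted $1,400K life cycle cost (initial $400 plus ten years at $100) are illustrative assumptions, not values from the paper.

    def npv(rate, initial, saving, years=10):
        """Present value of the annual savings less the initial outlay."""
        return sum(saving / (1 + rate) ** t for t in range(1, years + 1)) - initial

    score, roi = 77, .36                           # Vendor-B, from the tables above
    print(round(score * roi))                      # WORTH = SCORE X ROI  -> 28
    print(round(score * npv(.06, 400, 150)))       # WORTH = SCORE X NPV (hypothetical 6% rate)
    print(round(score / 1400, 3))                  # WORTH = SCORE / COST -> 0.055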

The next section will discuss and structure the subjective and objective evaluation criteria relevant to scoring decisions.

Sourcing Evaluation Criteria

The evaluation criteria used in selecting sourcing alternatives can be divided into two major categories:

Objective Criteria

                These can be quantified through costing.

Subjective Criteria

These require intuitive weighing and are used to score individual criteria.  They can also be used for screening out unacceptable approaches prior to the formal evaluation discussed in this paper.

The objective criteria used to compute Life Cycle Costs & ROI are discussed in a later section of this paper.  The subjective criteria evaluated through scoring are discussed in this section.

The scoring of criteria can often have different forms when applied to in-house and external vendors.  When relevant, these differences are highlighted.

Criterion 1 - End User Deliverables Functionality

When relevant, this functionality criterion evaluates the quality, from the view of the user, of the application/product/service deliverables to be provided by in-house or vendor organizations.

Criterion

What is the quality of the deliverables in terms of meeting end-user-defined functional requirements?

Scoring

The evaluation measures for developing a score for meeting functional requirements are completely dependent on the type of deliverable (e.g., application system, processing capability, image system, strategic plan).  A small portion of a multi-page functional evaluation follows as an example of the type of approach often used.

Deliverables Functionality Example - Applications Software

 REQ                              Essential (1)/   Standard (1)/   Points
                                  Desired (.8)     Custom (.5)
 7    Generate Monthly Reports
 7.1    Yield Analysis                 D                C             .4
 7.2    Arrears Trends                 E                S            1.0
 7.3    Loan Growth                    D                S             .8
 7.4    Rate of Return                 E                C             .5
 TOTAL POINTS                         3.6              3.0           2.7
 AVERAGE POINTS                                                      .75

Deliverables Functionality Example - Data Center

 REQ                              Essential (1)/   Standard (1)/   Points
                                  Desired (.8)     Custom (.5)
 5    Help Desk Capability
 5.1    Automated Task Status          E                C             .5
 5.2    Automated Report Status        E                S            1.0
 5.3    Automated Input Status         E                S            1.0
 5.4    Rescheduling Capability        D                C             .4
 TOTAL POINTS                         3.8              3.0           2.9
 AVERAGE POINTS                                                      .76
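
In both tables each requirement's points are the product of its Essential (1.0)/Desired (.8) weight and its Standard (1.0)/Custom (.5) rating, and the average is total points divided by the total Essential/Desired weight (2.7/3.6 = .75 and 2.9/3.8 = .76).  A sketch of that arithmetic:

    ESSENTIAL_DESIRED = {"E": 1.0, "D": 0.8}
    STANDARD_CUSTOM = {"S": 1.0, "C": 0.5}

    def functionality_score(requirements):
        """requirements: list of (E/D, S/C) pairs, one per requirement."""
        points = sum(ESSENTIAL_DESIRED[e] * STANDARD_CUSTOM[s] for e, s in requirements)
        return points / sum(ESSENTIAL_DESIRED[e] for e, _s in requirements)

    # Data center example, requirements 5.1-5.4: 2.9 / 3.8 = .76
    print(round(functionality_score([("E", "C"), ("E", "S"), ("E", "S"), ("D", "C")]), 2))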

 

Criterion 2 - Product/Service Life

When relevant, this criterion is used during the evaluation of products where continuous enhancement is needed over the planned life of the product or service.  Enhancement requirements can be due to such items as evolving user/legal requirements and evolving technologies.

In-House Supplier Criterion

        In-house suppliers are often assumed to have an indefinite life.  This can be very misleading if the internal enhancement skills required to maintain the product or service are not within the mainstream of IS activities.

    A. What is the probability that the skills needed for support of the product/service will be available over the project/service life cycle?

External Vendor Criteria

    B. What is the probability that the firm supplying support will maintain or improve its competitive position over the project/service life cycle?

    C. What is the probability that the firm supplying support will still be providing adequate support over the project/service life cycle?

Criterion Applicability

    HARDWARE:
        Processing       A, C
        Network          A, C
    SOFTWARE:
        Applications     A, B
        Systems          A, C

The scoring of this criterion is subjective and normally based on the number of years that in-house capability has been maintained or on the number of years that a potential vendor has been supplying the product and its competitive position during those years.

Scoring

Typical evaluation measures for developing a score for the product/service life criterion with sample weights follow for in-house and vendor providers.

 

I. Evaluating In-house Providers                            WEIGHTS

   A. Product/Service Stability (.6)
      1. At least "X" years of experience                      .3
      2. Required expertise available from other areas         .3
   B. Reputation of provider organization (.4)
      1. IT Management satisfaction                            .2
      2. Users' satisfaction                                   .2
   TOTAL                                                      1.0

II. Evaluating Vendors                                      WEIGHTS

   A. Product/Service Stability (.3)
      1. Firm at least "Y" years old                           .1
      2. Product at least "Z" years old                        .1
      3. Specializes in Product/Service Area                   .1
   B. Financial Stability (.3)
      1. Profitability                                         .15
      2. Asset/Equity Strength                                 .15
   C. Reference Sites' Reputation (.4)
      1. Product/Service Satisfaction                          .2
      2. Support/Training Satisfaction                         .2
   TOTAL                                                      1.0
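
All of the weighted checklists in this section can be scored the same way: rate each measure from 0 to 1 and take the weight-sum.  A sketch using the vendor checklist above (the uniform 0.8 ratings are hypothetical):

    # Measure -> weight, from the "Evaluating Vendors" checklist (weights sum to 1.0).
    VENDOR_MEASURES = {
        "Firm at least Y years old":           .1,
        "Product at least Z years old":        .1,
        "Specializes in product/service area": .1,
        "Profitability":                       .15,
        "Asset/equity strength":               .15,
        "Product/service satisfaction":        .2,
        "Support/training satisfaction":       .2,
    }

    def checklist_score(ratings):
        """ratings: measure -> 0.0-1.0 assessment from the evaluation team."""
        return sum(weight * ratings[m] for m, weight in VENDOR_MEASURES.items())

    ratings = dict.fromkeys(VENDOR_MEASURES, 0.8)    # hypothetical uniform ratings
    print(round(checklist_score(ratings), 2))        # -> 0.8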


Criterion 3 - Project Implementation Quality

When relevant, this criterion is used to evaluate the project management, implementation and maintenance support, and implementation planning quality that in-house and vendor providers intend to furnish for implementation of the product or service.

Criterion

What is the quality of the personnel to be assigned, and what is the probability that they will remain throughout the implementation period?

Scoring

Typical evaluation measures for developing a score for support quality together with sample weights follow.

 

Implementation Quality                                      WEIGHTS

   A. Project Management (.4)
      1. Project Director Quality                              .2
      2. Project Implementation Team Quality                   .2
   B. Implementation Plan (.2)
      1. Schedule Realism                                      .1
      2. Task Definition Realism                               .1
   C. Operations Support (.2)
      1. Training Quality                                      .1
      2. Documentation Quality                                 .1
   D. Maintenance Support (.2)
      1. Help Line Quality                                     .1
      2. Release System Quality                                .1
   TOTAL                                                      1.0

Criterion 4 - Platform Quality and Performance

When relevant, this criterion is used to evaluate the quality & performance of the processing platform(s) that in-house and vendor providers intend to use to process the desired product/service.

Criterion

What is the cost/performance, modularity, and reliability of the platform to be used, and what is the probability that it can meet anticipated performance, growth, and capability requirements over the life of the project/service?

Scoring

Typical evaluation measures for developing a score for the processing platform, together with sample weights follow. 

Processing Platform Quality                                 WEIGHTS

   A. Platform Performance (.2)
      1. Anticipated online performance                        .1
      2. Anticipated batch performance                         .1
   B. Software Availability (.2)
      1. Development Software Quality                          .1
      2. Applications Software Quality                         .1
   C. Platform Vendor Quality (.2)
      1. Firm at least (3 x technology cycle) years old        .1
      2. Financial Strength                                    .05
      3. History of Stability & Growth                         .05
   D. Hardware Components Quality (.2)
      1. Product Line at least (2 x technology cycle) years old  .05
      2. Quality & Support Reputation                          .05
      3. Expandable                                            .05
      4. Availability of Compatible Systems                    .05
   E. Systems Software Components Quality (.2)
      1. Product Line at least (1 x technology cycle) years old  .05
      2. Quality & Support Reputation                          .05
      3. Enhancement Reputation                                .05
      4. Availability of Alternatives                          .05
   TOTAL                                                      1.00


Criterion 5 - Support Quality

When relevant, this criterion is used to evaluate the quality of support/service anticipated from in-house and vendor providers.

Criterion

What is the quality of the persons and organizations supporting the project throughout the operational life of the project/service?

Scoring

Typical evaluation measures for developing a score for Support Quality, together with sample weights follow.

 

Support Quality                                             WEIGHTS

   A. Operations Support/Service (.6)
      1. Staff Quality                                         .3
      2. Training Quality                                      .15
      3. Documentation Quality                                 .15
   B. Maintenance Support/Service (.4)
      1. Help Line Staffing Quality                            .2
      2. Release Procedure Quality                             .2
   TOTAL                                                      1.0

Criterion 6 - End User Deliverables Architecture Quality

When relevant, this architecture criterion evaluates, from the view of the IT organization, the quality of the application / product / service deliverables to be provided by in-house or vendor organizations.

Criterion

What is the quality of the deliverables in terms of optimum balancing of their technology architecture's flexibility, effectiveness, and efficiency?

Scoring

Typical evaluation measures for developing a score for Deliverables Architecture Quality, together with sample weights follow. 

 

Deliverables Architecture Quality                           WEIGHTS

   A. System Design Flexibility (.4)
      1. Parametric Product Definition                         .2
      2. Modularity of Options                                 .2
   B. System Structure Effectiveness (.3)
      1. Development Productivity                              .1
      2. Production Efficiency                                 .1
      3. Technology Reliability                                .1
   C. Documentation Quality (.3)
      1. HELP Screens                                          .1
      2. USER Documentation                                    .1
      3. IT Documentation                                      .1
   TOTAL                                                      1.0

 

Criterion 7 - Provider Infrastructure

As relevant, this infrastructure criterion evaluates the fit between user and IT consumer organizations and in-house or vendor providers.

Criterion

What is the level of agreement between the consuming and providing organizations in terms of factors such as management style, technology innovation, standards utilization, and productivity or quality tradeoffs?

Scoring

Typical evaluation measures for developing a score for provider compatibility, together with sample weights follow.

 

Provider Compatibility                                      WEIGHTS

   A. Industry Understanding and Commitment (.2)
      1. Research and Development Commitment                   .1
      2. Staff Development Commitment                          .1
   B. Contract Terms and Conditions (.15)
      1. Initial Arrangements                                  .05
      2. Renegotiation for Special Conditions                  .05
      3. Takeback Arrangements                                 .05
   C. Management Style Compatibility (.05)
      1. Structural Formalism                                  .01
      2. Monitoring and Control                                .02
      3. Staffing and Career Paths                             .02
   D. Standards Compatibility (.2)
      1. Planning Methods                                      .1
      2. Development Methods                                   .025
      3. Production Methods                                    .025
      4. Communication Methods                                 .025
      5. Data Base Methods                                     .025
   E. Productivity and Quality Orientation (.2)
      1. Development Performance                               .1
      2. Production Performance                                .1
   F. Innovation Orientation (.2)
      1. Development Technology                                .1
      2. Production Technology                                 .1
   TOTAL                                                      1.0

 

Criterion 8 - User References

As relevant, this criterion evaluates the results of the provider's user site visits and/or references.

Criterion

What is the quality of the provider's reference sites, and how do their users evaluate the commitments, quality of products/services, and level of support provided?

Scoring

Typical evaluation measures for developing a score for User References, together with sample weights follow.

User References                                             WEIGHTS

   A. Company Management's Evaluation                          .25
   B. IS Management's Evaluation                               .25
   C. Professional Staff's Evaluation                          .25
   D. User Staff's Evaluation                                  .25
   TOTAL                                                      1.0

Sourcing Cost Categories

The objective of the costing process is to present a complete and understandable set of current system costs for the denominator of the worth index, so that alternative providers can provide comparable pricing. 

The steps generally used to develop the costs needed involve a) determining relevant functions for organizations or locations with the potential to be outsourced, b) producing a functional cost analysis for each, c) obtaining prices from potential providers, and d) adjusting bids to produce comparable life cycle costs for each feasible alternative.

Guidelines for preparing and analyzing appropriate costs are presented in the next portion of this paper.

Step 1: Determine Relevant Cost Types

Using the following staffing and costing categories as an example, define the categories appropriate for the organizations and functions to be evaluated and fill in Functional Costs Analysis Forms for each organization and business function.

TYPICAL COST CATEGORIES

Personnel Related Costs
   FTE*
   Salary
   Benefits
   Bonus/Incentives
   Communication
   Memberships & Dues
   Occupancy
   Professional Fees
   Supplies
   Training
   Travel Related
   Misc. Personnel Costs

Equipment Related Costs
   Hardware Depreciation
   Software Depreciation
   Rental/License
   Hardware Maintenance
   Software Maintenance
   Technical Support
   Fixed Assets Depreciation
   Misc. Equipment Costs

Network Related Costs
   Voice Network
   Data Network: WAN
   Data Network: LAN
   Telecomm Hardware
   Telecomm Software
   Other Network Assets

Specialized Data Center Costs
   Processors
   DASD
   Tapes/Cartridges
   Printers
   Terminals
   Maintenance Contracts
   Systems Software
   Applications Software
   Misc. Data Center Costs

Other Costs
   Cost of Capital
   Data Center Charges
   Facilities/Utilities
   Furniture/Fixtures
   Insurance
   Outsourced Services
   Off-Site Storage
   Property Taxes
   Publishing/Printing
   Security/Disaster Recovery
   Transportation/Vehicles
   Misc. Other Costs

* Non-cost items

 

Step 2: Determine Costs of Relevant Functions

Using a form similar to the IS Functional Cost Analysis Form that follows, a life cycle cost analysis should be prepared for all involved organizations or functions, detailing by cost category the current and anticipated life cycle costs.

A portion of a typical completed form follows.  The spreadsheet function used for the future value calculation in the last column is:

                                @FV(interest, term, payments)

To compute the first row, use @FV(.06, 10, 30), yielding 395.
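
The same future value growth can be sketched directly in Python, matching the spreadsheet's @FV call:

    def fv(rate, term, payment):
        """Future value of a level annuity: payment * ((1+rate)^term - 1) / rate."""
        return payment * ((1 + rate) ** term - 1) / rate

    print(round(fv(.06, 10, 30)))   # -> 395, the Hardware Depreciation row below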

SAMPLE OF PARTIAL "IS FUNCTIONAL COST ANALYSIS FORM"

 

Organization:  Retail Banking Group

Function:  Application Systems Development

Location:  Headquarters

 

Equipment Related Costs

 

Current

Annual

Costs

 

Annual

Growth

Factor

 

Total

Life Cycle

Costs

 Hardware Depreciation

$          30

 

6%

 

$         395

 Software Depreciation

200

 

8%

 

2,897

 Rental/License

700

 

4%

 

8,404

 Hardware Maintenance

80

 

4%

 

960

   

 

 

 

 

 

Step 3: Produce Alternative Life Cycle Costs

A series of forms, similar to the Alternative Life Cycle Costing Form that follows, should be completed, one for each feasible provider, during the proposal preparation period.  The forms should be used both for initial (one-time) costs and for life cycle (recurring) costs.  Note that the Current Project Costs column is normally not relevant when the form is used for initial costs.

A portion of a typical completed form follows.

SAMPLE OF PARTIAL "ALTERNATIVE LIFE CYCLE FORM"

 Provider: VENDOR B

                O-Initial     O-Life-Cycle

Organization:  Retail Banking Group

Function:  Application Systems Development

Location:  Headquarters

 

Personnel Related Costs

 

Current

Project

 

 

Retained

 

Bid

by

Vendor

Total

Project

 

 

Change

 FTE Employees

300 

40 

230 

270 

-30 

 Salary

$ 14,000

$  1,200

$ 10,200

$11,400

$(2,600)

 Benefits

2,400

200

1,650

1,850

(750)

 Bonus/Incentives

1,100

0  

750

750

(350)

 Travel Related

80

15

70

85

5

 

 

 

 

 

 

                * Note: all dollar values in thousands

 

Step 4: Produce Life Cycle Cost Summary of Each Alternative

The final step in preparing the cost data for use in the worth analysis is to summarize the initial and recurring costs for each alternative.  A table similar to the following can be used to present the summaries and to show the relevant ROI.

Typical Life-Cycle Savings and ROI Summary

 Quantitative Criteria          Vendor A   Vendor B   Vendor C   In-House

 Initial Costs (Period one)       $300       $400       $400       $800

 Current Project Costs*            200        250        280        280
    less
 Total Bid Project Costs*          100        100        100        150

 Savings (Loss)*                  $100       $150       $180       $130

 ROI**                             .31        .36        .44        .10

  * Sum of annual total dollars in thousands from Step 3 (e.g., total 10-year costs or savings).
 ** See the System Life Oriented Presentation Methodology section for the calculation approach.

Step 5:  Compute Worth Index

The computation of an illustrative quantitative worth index is now straightforward.

 

WORTH INDEX CALCULATION

                                 Multi-Product  Specialized   Start-up    In-house
                                   Vendor-A      Vendor-B     Vendor-C    Development
 Technical Score (initial table)       68            77           67          88
 ROI (from Step 4)                    .31           .36          .44         .10
 Worth Index
   (Technical Score X ROI)             21            28           29           9

Based on the Worth Index, Vendors B and C are approximately equal from an objective (quantitative) viewpoint.  The decision between them would be based on subjective criteria such as competitive issues and internal control.

Calculation of the worth index using ROI is illustrated in the table above.  Alternatively, the worth index can be calculated using NPV or life cycle costs, with the following formulas.

  • Using ROI

                                WORTH = SCORE X ROI

  • Using NPV

                                WORTH = SCORE X NPV

  • Using Life Cycle Costs

                                WORTH = SCORE / COST

Worth Index Oriented Presentation Methodology

The following chart has been useful in presenting the results of the worth index methodology to management.  Two of the loan application scores were noticeably close, while there was an obvious winner in the finance area.  This is typical in the authors' experience.

The final decision was based on site visits to vendor-A and vendor-B user sites.

[Chart not reproduced.]  Platform Architectures: MF is mainframe, HP is high performance, PC is PC/LAN, and AS is a mini.

System Life Oriented Presentation Methodology

There are two basic sourcing approaches: non-competitive and competitive bidding.  To this point, this paper has considered only competitive bidding.  The Savings vs. System Life presentation chart shown below can be used for displaying either a) an RFP methodology winner's ROI using the savings vs. system-life profile; b) an RFQ methodology in which the lowest cost vendor whose management and technical proposals are evaluated as feasible is selected; or c) a sole source (often in-house) provider (Rosenthal 1991).

The calculations are normally done using a spreadsheet.  A formula approach follows.

              Col A          Col B             Col C                  Col D

 Row 5        Cash Flows     first period      second period          third period
                             net cash flow     net cash flow          net cash flow

 Row 10       IRR            -1                @IRR(B10,$B5...C5)     copy formula from prior column
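
A sketch of the same profile computation in Python: the IRR is recalculated over successively longer system lives, giving the savings vs. system-life curve.  The Vendor-B style cash flows are illustrative, and the bisection bounds are arbitrary.

    def irr_profile(cash_flows):
        """IRR of cash_flows[0..k] for each candidate system life k (flows[0] is the outlay)."""
        def irr(flows, lo=-0.99, hi=10.0):
            def npv(rate):
                return sum(f / (1 + rate) ** t for t, f in enumerate(flows))
            for _ in range(200):                  # bisection; npv falls as the rate rises
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)
            return lo
        return [irr(cash_flows[:life + 1]) for life in range(1, len(cash_flows))]

    flows = [-400] + [150] * 10                        # outlay, then annual net cash flows
    print([f"{r:.0%}" for r in irr_profile(flows)])    # climbs from a loss toward 36%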

 
Experience with the Methodology

The original concept for this evaluation methodology dates from 1991, when The Sourcing Interests Group asked one of the authors to design a reusable methodology for its members' use.  Since that time, the authors have used the methodology numerous times in their consulting practice.  Regrettably, the reports from these projects are confidential.  The proceedings by the authors (Park 2003; Rosenthal 1991 and 2003) reference presentations that included brief overviews of the application of the process.


References

AICPA.  Internal Control Reporting - Implementing Sarbanes-Oxley Section 404.  AICPA paperback.  www.aicpa.org; see also www.404institute.com.

Halifax (2000).  "Evaluation Framework for the Request for Proposals".  Halifax Harbour Solutions Project.  http://www.region.halifax.ns.ca/harboursol/rfp_evaluation.pdf

HIPAA.  Health Insurance Portability and Accountability Act of 1996.  www.cms.hhs.gov/hipaa

Park, L. Jane and Rosenthal, Paul (2003).  "Costing and Presentation Approach for an Information Systems Project".  Proceedings of the Hawaii International Conference on Business.  Honolulu, June 18-21, 2003.

Rosenthal, Paul (1991).  "Graphical Analysis of IS & MS Project Economics: The Media is the Message".  Proceedings of the ORSA/TIMS Joint National Meeting.  Anaheim, CA, November 3-6, 1991.

Rosenthal, Paul and Park, L. Jane (2003).  "Outsourcing Evaluation Approach for an Information Systems Project".  Proceedings of the International Business & Economics Research Conference.  Las Vegas, October 6-10, 2003.

The European Commission, Fifth Framework Programme (2001).  "Evaluation Procedures for the Programme, User-Friendly Information Society".  http://www.cordis.lu/fp5/management/eval/r_evaluation.htm


[1]  Net present value is not used here because it also requires a forecast of the cost of funds over the project life cycle.
 


For Further Information Contact

Dr. Paul Rosenthal
Information Systems Department, ST603
California State University, Los Angeles
5151 State University Drive
Los Angeles, CA 90032-8123

[email protected]


