A GENERAL EVALUATION MODEL CRITERIA FOR SELECTING AN OPTIMUM STRUCTURED DEVELOPMENT METHODOLOGY
Many Structured Development Methodologies exist in the Information Science/Technology scene. Some look similar in many respects, some use techniques that overlap in concept, the advantages (or otherwise) of some are more apparent in certain types of Systems than in others, and some place more emphasis on particular aspects of Information Systems development. This state of affairs leaves the Systems Analyst and Designer in a quandary over which of these methodologies would suit a particular Systems Development project. The reported research focuses on the determination of a general evaluation model and criteria for selecting an optimum Structured Development Methodology for the development of a wide range of Information Systems.
Framework for Evaluation:
This framework (see Figure 3) can be conceptualised as a
scenario in which the problem domain or Object System
(the Organisation, considered as a problem area) is to be
assessed against a background of many recommended
approaches (or methodologies) for the development of
information systems.
The Evaluation Rules should include the following guidelines (based on the nature of the problem area for the Object System):

* The composition of an evaluation committee
* The aspects (perspectives) of the methodologies to be evaluated
* The level of decomposition (as provided for by the methodology) to be considered or seen as desirable
* The level of detail to be evaluated per methodology
* How the extent of subjectivity in evaluation (particularly weighting) can be reduced (for example, through iterations using the Delphi Technique of quantitative studies)
* How decisions can be taken on very closely scored methodologies (for example, by increasing the level of detail evaluated)

The different methodologies would be introduced into the evaluation framework (matrices or tables) determined in the evaluation model. The Evaluation Model would simply be a model of the selection processes, resulting in a priority list of methodologies and also in a final decision. This model is the core of the scenario and is expanded further, as shown in Figure 4.
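As an illustration only, the sketch below shows how such an evaluation framework could be realised as a weighted scoring table that yields a priority list. The phase names follow the Topographic Database example later in this section; the methodology names, weights, scores and the simple weighted-sum rule are assumptions made for the sketch, not values or procedures prescribed by the reported research.

```python
# A minimal sketch of the evaluation framework as a weighted scoring table.
# Phase names follow the Topographic Database example; the methodology names,
# weights and scores are hypothetical placeholders, not values from the study.

phase_weights = {"Design": 0.5, "Implementation": 0.3, "Operation": 0.2}

# Scores (e.g. 1-10) awarded to each methodology per phase by the committee.
scores = {
    "Methodology A": {"Design": 8, "Implementation": 6, "Operation": 7},
    "Methodology B": {"Design": 7, "Implementation": 8, "Operation": 5},
    "Methodology C": {"Design": 6, "Implementation": 7, "Operation": 9},
}

def weighted_total(phase_scores: dict) -> float:
    """Combine per-phase scores into a single figure using the phase weights."""
    return sum(phase_weights[p] * s for p, s in phase_scores.items())

# Priority list: methodologies ranked by weighted total, best first.
priority_list = sorted(scores, key=lambda m: weighted_total(scores[m]), reverse=True)

for rank, name in enumerate(priority_list, start=1):
    print(f"{rank}. {name}: {weighted_total(scores[name]):.2f}")
```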
Further Explanation of the Framework
• STEP 0: This involves outlining the principal phases through which the development process for the Object System would (or should) have to go. In the presented example of Topographic Database Systems, the phases of development are [Radwan, undated] Design, Implementation and Operation.
• STEP 1: Here the functions and/or activities within the phases defined above are listed.
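One simple way to capture the outputs of STEP 0 and STEP 1 is a mapping from each phase to its functions/activities, as sketched below. The phase names are those cited for the Topographic Database example; the functions/activities shown are hypothetical placeholders, since the section does not list them.

```python
# STEP 0 and STEP 1 captured as data: the phases of the Object System's
# development process, each with the functions/activities to be evaluated.
# Phase names follow the Topographic Database example; the activities listed
# are hypothetical placeholders.
development_process = {
    "Design": ["requirements capture", "data modelling", "schema design"],
    "Implementation": ["database construction", "data loading", "testing"],
    "Operation": ["maintenance", "updating", "user support"],
}

for phase, activities in development_process.items():
    print(phase)
    for activity in activities:
        print(f"  - {activity}")
```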
[Figure 3 shows the Evaluation Rules and the Different Methodologies and Techniques feeding into the Evaluation Model, which produces a Priority List of Methodologies and Techniques and a Final Decision.]

Figure 3 An Overview of the Evaluation Process
• STEP 2: The importance of each phase and function/activity is weighted. This is done independently of any consideration of any methodology; it has more to do with the peculiar circumstances of our Object System, and each evaluator would state reasons why a particular phase or function/activity is given more importance relative to the rest. Viewed as an individual judgement, weighting can be highly subjective, because the phases are not mutually exclusive and the evaluators have different perceptions and experiences. Through iterations, the evaluators can re-evaluate their weights (where discrepancies are unacceptable) and move towards an acceptable mean.
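A hedged sketch of this iterative weighting step is given below: each evaluator weights the phases independently, the spread of the weights is inspected, and any phase whose weights disagree by more than a chosen tolerance is flagged for re-evaluation in the next Delphi-style round. The evaluator names, weight values and tolerance are illustrative assumptions, not figures from the reported research.

```python
from statistics import mean, stdev

# Independent phase weights proposed by each evaluator (hypothetical values).
evaluator_weights = {
    "Evaluator 1": {"Design": 0.7, "Implementation": 0.1, "Operation": 0.2},
    "Evaluator 2": {"Design": 0.4, "Implementation": 0.4, "Operation": 0.2},
    "Evaluator 3": {"Design": 0.5, "Implementation": 0.3, "Operation": 0.2},
}

TOLERANCE = 0.10  # maximum acceptable spread (standard deviation) per phase, assumed

for phase in ["Design", "Implementation", "Operation"]:
    values = [weights[phase] for weights in evaluator_weights.values()]
    agreed_mean = mean(values)
    spread = stdev(values)
    status = "accept mean" if spread <= TOLERANCE else "re-evaluate next round"
    print(f"{phase}: mean={agreed_mean:.2f}, spread={spread:.2f} -> {status}")
```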