…include the selection per methodology per evaluator based on the contingency model (OSAD, ISAD viewed in relation to Aspects, as well as other criteria).
• STEP 9: The preceding step has its level of subjectivity (per evaluator), which has to be brought to a more objective level of choice. This can be achieved using the MECCA technique (Multi-Element Component Comparison Analysis). Composite results per evaluator are put together in a table. The result of this step is a priority list of methodologies with the highest total scores, to be considered either per phase of the Object System or as a whole, but preferably per phase, so that a methodology can be seen to be transparently suitable for a particular phase, or otherwise. It should be noted that other considerations within the MECCA technique (such as weighting) are already embedded in steps 2 through 6.
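The MECCA-style aggregation in Step 9 can be sketched as follows; this is a minimal illustration in which the evaluator names, methodology names, and composite scores are all hypothetical, not taken from the paper:

```python
# Sketch of Step 9: combine per-evaluator composite scores into a
# ranked priority list of methodologies (MECCA-style aggregation).

def mecca_priority_list(evaluator_scores):
    """evaluator_scores maps evaluator -> {methodology: composite score}.
    Returns a list of (methodology, total score), highest total first."""
    totals = {}
    for scores in evaluator_scores.values():
        for methodology, score in scores.items():
            totals[methodology] = totals.get(methodology, 0.0) + score
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

# Hypothetical example: three evaluators scoring three methodologies.
scores = {
    "evaluator_A": {"SSADM": 7.5, "Yourdon": 6.0, "IE": 5.5},
    "evaluator_B": {"SSADM": 6.5, "Yourdon": 7.0, "IE": 5.0},
    "evaluator_C": {"SSADM": 8.0, "Yourdon": 6.5, "IE": 6.0},
}
print(mecca_priority_list(scores))
# → [('SSADM', 22.0), ('Yourdon', 19.5), ('IE', 16.5)]
```

The same totals can of course be computed per phase of the Object System rather than overall, as the step recommends.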
• STEPS 10 AND 11: The Final Decision on Development Strategy can be an approach using: (i) prototypes, (ii) specifying during the System Life Cycle, or (iii) a mixture of both. The determination can be effected by considering both the degrees of uncertainty and complexity as contingencies relevant for the evaluation of the Object System for strategy determination. This line of reasoning has become acceptable, though with variations [Episkopou and Wood-Harper, 1986; Burns and Dennis, 1985; Modha et al., 1990]. To arrive at an optimum strategy and decision, the above-stated contingencies are used to assess the Object System in order to determine the uncertainty and complexity levels of the development process.
[Decision matrix plotting the assessed uncertainty and complexity of the Object System; legend entries: Specifying, Prototyping]
Figure 6 An Example of a Decision Matrix for Development Strategy
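The strategy determination of Steps 10 and 11 can be sketched as a simple contingency lookup; the two-level cut-offs and the assignment for the mixed cases below are illustrative assumptions, not a reproduction of the matrix in Figure 6:

```python
# Sketch of Steps 10-11: map the assessed contingencies (uncertainty,
# complexity) of the Object System to a development strategy.

def development_strategy(uncertainty, complexity):
    """Both arguments are 'low' or 'high'. The low/low -> prototyping
    case follows the paper; the other assignments are assumptions."""
    if uncertainty == "low" and complexity == "low":
        return "prototyping"
    if uncertainty == "high" and complexity == "high":
        return "mixture of prototyping and specifying"
    # One contingency high, the other low (illustrative assumption):
    return "specifying during the system life cycle"

print(development_strategy("low", "low"))    # → prototyping
```

As noted below, a low-complexity, low-uncertainty outcome may make further evaluation of the structured methodologies unnecessary.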
CONCLUDING REMARKS
The selection of an optimum methodology for developing any Information System has to be preceded by a thorough understanding of trends in Information System Development, the nature of the Object System, and a well-reasoned analysis of any criteria and contingencies necessary for use in the selection process.
The criteria scores can be used directly, without the phase weighting, if one does not intend to evaluate the methodologies over the entire phases of the Object System. It can be reasoned that if it is accepted that no single methodology has yet proven to be satisfactory over the entire systems life cycle, then a framework for evaluation need not seek such a methodology; rather, it can concentrate on refining the criteria C1 and examining C2 in terms of their importance relative to each other.
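The role of the phase weighting can be illustrated with a small sketch; the phase names, scores, and weights below are hypothetical:

```python
# Sketch of phase weighting: an overall methodology score is the sum of
# per-phase criteria scores multiplied by phase weights. When only one
# phase is of interest, the raw criteria score is used directly instead.

def weighted_score(phase_scores, phase_weights):
    """Combine per-phase criteria scores using phase weights."""
    return sum(phase_scores[p] * phase_weights[p] for p in phase_scores)

# Hypothetical per-phase scores and weights for one methodology.
phase_scores = {"analysis": 7.0, "design": 6.0, "implementation": 5.0}
phase_weights = {"analysis": 0.5, "design": 0.3, "implementation": 0.2}
print(weighted_score(phase_scores, phase_weights))  # → 6.3
```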
The question of levels of decomposition and aspects becomes apparent when one considers that methodologies have specific orientations, and the techniques they recommend prove either too detailed or too broad to be used effectively in some phases.
Step ten of the framework (establishment of development strategy) may precede step zero, depending on other experiences of the evaluation committee. This is so since a step-ten result of low complexity + low uncertainty may require only prototyping, in which case it would no longer be expedient to continue evaluating the structured methodologies. But even prototyping may not be carried out without some form of preliminary study; in that case, a soft approach may come in handy.
The evaluation model lends itself to criticism, but it is hoped that the flexibility offered will make it applicable to a wide range of information systems.
REFERENCES
Avison, D.E. and Fitzgerald, G., 1988. Information Systems Development: Methodologies, Techniques, and Tools. Blackwell Publications, Oxford.
Burch, J.G. (Jnr.), Strater, F.R., and Grudnitski, G., 1979. Information Systems: Theory and Practice (2nd ed.). John Wiley & Sons, New York.
Burns, R.N. and Dennis, A.R., 1985. Selecting the Appropriate Application Development Methodology. Data Base, Fall, pp. 19-23.
Cutts, G., 1991. Structured Systems Analysis and Design Methodology (2nd ed.). Blackwell Scientific Publications, London.
Davis, G.B. and Olson, M.H., 1985. Management Information Systems: Conceptual Foundations, Structure, and Development (2nd ed.). McGraw-Hill, Singapore.
Dippel, G. and House, W.C., 1969. Information Systems: Data Processing and Evaluation. Scott, Foresman & Co., Illinois.
Dutch User Group of Structured Development Methodologies, 1990. 16 Methodologies for Systems Development: A Comparative Review.
Elfving, A. and Kirchoff, U., 1991. Design Methodology for Space Automation and Robotics Systems. ESA Journal, Vol. 15, pp. 149-164.
Episkopou, D.M. and Wood-Harper, A.T., 1986. Towards a Framework to Choose Appropriate IS Approaches. The Computer Journal, Vol. 29, No. 3, pp. 222-228.
Essien, O.U., 1992. Selection of an Optimum Structured Methodology for the Development of Topographic Database Systems. MSc Thesis, ITC.