The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications

Journal Article

Questionnaire Based Usability Evaluation of Hospital Information Systems  pp21-30

Kai-Christoph Hamborg, Brigitte Vehse, Hans-Bernd Bludau

© Jan 2004 Volume 7 Issue 1, Editor: Dan Remenyi, pp1 - 66


Abstract

The widespread distribution of hospital information systems (HIS) requires professional evaluation techniques. In this study we present a usability questionnaire called IsoMetrics, which is based on the international standard ISO 9241 Part 10. The questionnaire was applied to assess the usability of a Hospital Information System. The equivalence of the online and paper‑and‑pencil formats of the questionnaire was investigated. The results show that the different formats do not affect the subjects' ratings. IsoMetrics proved to be a reliable technique for software evaluation in the field of hospital information systems, supporting usability screenings in large organisations.

 

Keywords: Evaluation, usability, ISO 9241 Part 10, Hospital Information Systems, HIS, online questionnaire

 


Journal Article

The Evaluation of Software Quality Factors in Very Large Information Systems  pp43-48

Souheil Khaddaj, G Horgan

© Jan 2004 Volume 7 Issue 1, Editor: Dan Remenyi, pp1 - 66


Abstract

A quality model links together and defines the various software metrics and measurement techniques that an organisation uses; the measurement approach taken must be sufficiently general to cover hybrid hardware and software systems. In this work, the software quality factors that should be taken into account in very large information systems are considered. Such systems require a high degree of parallelism and involve a large number of processing elements. We start by identifying the metrics and measurement approaches that can be used. Many of the quality factors apply in a similar way to sequential and parallel/distributed architectures; however, a number of factors that are specific to the parallel class are investigated. In such a system many elements can fail, which can have a major impact on the system's performance and therefore affects the cost/benefit factors. Portability and usability are other major concerns that need to be taken into account when considering all the relevant factors that affect quality in such environments.

 

Keywords: Quality Modeling, Quality Measurement, Software Quality, Very Large Information Systems, Distributed Computing

 


Journal Article

A Chronic Wound Healing Information Technology System: Design, Testing, and Evaluation in Clinic  pp57-66

Antonio Sánchez

© Jan 2004 Volume 7 Issue 1, Editor: Dan Remenyi, pp1 - 66


Abstract

In the UK, chronic wound healing is an area of specialist clinical medicine that operates within the framework of the National Health Service. It has been the basis for the design, testing and evaluation of a prototype system of information and communication technology (ICT), specifically adapted to the domain. Different wound healing clinics were examined using a combination of 'hard' and 'soft' methods to allow a richer perspective on the activity and to gain a deeper understanding of the human activity, its relation to the working information system, the existing information technology (IT), and the potential of a comprehensive IT system to manipulate live data in clinic. Clinicians and administration staff were included in all aspects of the work to enhance the design lifecycle and the understanding of the process. An observe, report, plan and act (ORPA) cycle, based on the dictates of action research, was established to accomplish the design and testing of a system that clinicians were comfortable enough with to consider its use in clinic. Three different strategies were applied to evaluate its use in participating clinics. Cultural historical activity theory was used as the main framework to analyse the activity system, and to interpret the clinicians' and the system's performance, as well as their evaluation of the experience. Activity breakdown areas are suggested and reasons for them are considered in the light of wound‑care workers' feedback, and the researcher's observations, notes, and analysis.

 

Keywords: Electronic data manipulation, clinical ICT, information technology evaluation

 


Journal Article

Modelling Risks in IS/IT Projects through Causal and Cognitive Mapping  pp1-10

Abdullah J. Al-Shehab, Robert T. Hughes, Graham Winstanley

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80


Abstract

Software systems development and implementation have become more difficult with the rapid introduction of new technology and the increasing complexity of the marketplace. This paper proposes an evaluation framework for identifying the causes of shortfalls in implemented information system projects. This framework has been developed during a longitudinal case study of a problematic project, which is described.

 

Keywords: causal and cognitive mapping, project evaluation, information systems project risk

 


Journal Article

Seven Ways to get Your Favoured IT Project Accepted — Politics in IT Evaluation  pp31-40

Egon Berghout, Menno Nijland, Kevin Grant

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80


Abstract

IS managers are being put under increasing pressure to justify the value of corporate IT/IS expenditure. Their constant quest for the 'holy grail' continues, as existing methods and approaches for justifying IT/IS expenditure are still failing to deliver. The decision‑making process is not as objective and transparent as it is claimed or intended to be. This paper discusses seven tactics commonly used by business managers to influence IT appraisals. The paper takes a 'devil's advocate' position and adopts some irony when looking at the area of power and politics in IT evaluation. Rather than promoting the use of these techniques, this article aims to raise awareness that IT evaluation is not as rational as most IT evaluation researchers and practitioners would want it to be, or indeed claim it to be. It is argued that rationalisation or counter‑tactics may counteract influence techniques in an attempt to get behind the cloak‑and‑dagger side of organisational power and politics, but politics and power in decision‑making cannot and should not be filtered out. Owing to dissimilarities of objectives and limitations of time and information, influence techniques will always be used. However, rather than being counterproductive, these techniques are essential to the decision‑making process for IT projects. They help organisations reach better decisions, which receive more commitment than decisions that were forced to comply with strictly rational approaches. Awareness of the influence and manipulation techniques used in practice will help in dealing with power and politics in IT evaluation and thereby lead to better IT investment decisions.

 

Keywords: IT Evaluation, IT Decision Making, IT Assessment, Information Economics, Decision Making, Organisational Power & Politics, Information Management

 


Journal Article

Exception‑Based Approach for Information Systems Evaluation: The Method and its Benefits to Information Systems Management  pp51-60

Heikki Saastamoinen

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80


Abstract

Exceptions are events that cannot be handled by an information system by following its normal processing rules. Exceptions arise for two main reasons: flaws in system design and post‑implementation changes in the system's domain. Only a few exceptions should arise in an information system that serves its user community well. In practice, this is rarely the case, and exceptions are sometimes rather common even in routine processes. In this paper, an exception‑based approach to evaluating information systems is presented, together with practical examples of its use. The benefits of the analysis for information systems management are also elaborated.

 

Keywords: Information Systems Evaluation, Exception Handling, Information Systems Management

 


Journal Article

Broadening Information Systems Evaluation Through Narratives  pp115-122

Jonas Hedman, Andreas Borell

© Sep 2005 Volume 8 Issue 2, Editor: Dan Remenyi, pp81 - 142


Abstract

The purpose of information systems post‑evaluation ought to be to improve the use of systems. The paper proposes the use of narratives as a tool in post‑evaluations. The potential of narratives is that they can convey meanings, interpretations, and knowledge about the system, which may lead to action. The paper offers three main suggestions: 1) evaluations should form the basis for action; 2) narratives make evaluation more relevant; and 3) post‑evaluations should be done with the aim of improving use. Narratives should be viewed as a complement to traditional evaluation methods and as a way of making evaluation more formative, thereby moving away from the more common summative perception of evaluation. The conclusion of the paper is that narratives can advance IS evaluation and provide a richer evaluation picture by conveying meanings not included in traditional evaluations.

 

Keywords: Narratives, information systems evaluation, measurements, measure, stories, action

 


Journal Article

A Process Capability Approach to Information Systems Effectiveness Evaluation  pp7-14

Sevgi Ozkan

© Mar 2006 Volume 9 Issue 1, Editor: Dan Remenyi, pp1 - 43


Abstract

While defining or measuring the effectiveness of the information systems (IS) function has proven complicated, further effort on refining IS assessment is essential for the effective management and continuous improvement of both the IS function and the organisation. In addition, an effort to investigate the relationships among the established IS assessment tools, to better reconcile their existing differences, is warranted. This paper aims to clearly differentiate the notion of 'Software' from that of 'Information Systems'. A new IS assessment model is proposed to provide a more holistic view of how IS quality may be assessed, by means of a process capability understanding of evaluating IS effectiveness within the organisational context.

 

Keywords: Information systems quality, Information systems effectiveness, Assessment, Software process maturity, Process capability

 
