The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.

Journal Article

Peer Assessment: A Complementary Instrument to Recognise Individual Contributions in IS Student Group Projects  pp61-70

Elsje Scott, Nata van der Merwe, Derek Smith

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80


Abstract

This paper discusses peer assessment as a component of the assessment strategy used for Information Systems student group projects at a South African university. It discusses the value of peer assessment and its contribution to the real-life experience offered by group projects, and illustrates how the process adds value by enhancing deep learning. The paper shows how peer assessment serves as a complementary instrument within a multiple assessment strategy, and how its results are used to recognise individual contributions to group performance. The use of peer assessment for both informal formative assessment and formal summative assessment is described. To perform the peer assessment, specific instruments were designed and used throughout the lifecycle of the course.

 

Keywords: Peer assessment, group work, assessment, self-assessment, IS Project

 


Journal Article

The Eleven Years of the European Conference on IT Evaluation: Retrospectives and Perspectives for Possible Future Research  pp81-98

Egon Berghout, Dan Remenyi

© Sep 2005 Volume 8 Issue 2, Editor: Dan Remenyi, pp81 - 142


Abstract

This paper provides an overview of the papers presented at the European Conference on IT Evaluation (ECITE) over the past eleven years. It considers the main issues and learning themes addressed in papers presented to these conferences, reflects on possible future directions this research may take, and suggests three major research themes. Some 356 papers have been presented at ECITE. Over the eleven-year period, the level of understanding reflected in the papers has clearly increased. Themes that were particularly well addressed include IT and IS value, the multidisciplinary nature of evaluation, the importance of stakeholder analysis, organisational learning and life cycle management. Three issues are identified as particularly important for further research: the theoretical underpinning of IT evaluation, improving the data sets for research and establishing a more common core of concepts.

 

Keywords: IT, IS, Evaluation, Theoretical frameworks, empirical research, case studies, questionnaires, core concepts, corporate politics, data sets, research maturity

 


Journal Article

Common Gaps in Information Systems  pp123-132

Juha Kontio

© Sep 2005 Volume 8 Issue 2, Editor: Dan Remenyi, pp81 - 142


Abstract

This multiple case study evaluates information systems and databases in six Finnish organizations. The main aim of the research was to describe the main gaps in the case organizations' information systems; in each case the gaps are presented with authentic descriptions. The research identified seven different categories of gaps in total. These are first abstracted to four common categories of gaps: 1) data, 2) infrastructure, 3) turning data into information and 4) people working with the information systems. Finally, the four categories are further abstracted to two common categories of gaps: 1) information and 2) infrastructure.

 

Keywords: Information Systems, IS-Gaps, Databases, Case Study

 


Journal Article

Evaluating Success in Post‑Merger IS Integration: A Case Study  pp143-150

Maria Alaranta

© Jan 2006 Volume 8 Issue 3, ECITE 2005 Special, Editor: Dan Remenyi, pp143 - 230


Abstract

Despite the importance of post-merger IS integration to the success of the whole merger, the post-merger IS integration literature remains scarce. This paper attempts to synthesise the often implicit or vague definitions of post-merger IS integration success with those provided in the vast body of literature on IS evaluation. As a result, four categories of success issues for post-merger IS integration are proposed: user satisfaction with the integrated software's system and information quality as well as its use; efficient and effective IS integration management; efficient IS staff integration; and the ability of the IS to support the underlying motives of the merger.

 

Keywords: IS Integration, Mergers, Acquisitions, M&A, Success, IS Evaluation

 


Journal Article

IS Evaluation in Practice  pp169-178

Ann Brown

© Jan 2006 Volume 8 Issue 3, ECITE 2005 Special, Editor: Dan Remenyi, pp143 - 230


Abstract

IS evaluation exercises continue to engender scepticism. The evaluation of IS investment is considered a 'wicked problem', and there are good reasons for this judgement. The topic has attracted many researchers, and there is a substantial body of literature on the problems of measurement and the inadequacies of traditional investment appraisal methods. A wide range of alternative tools has been proposed to replace these approaches, but many surveys of actual practice have found little evidence of their use. Reported IS evaluation practice appears to be relatively unsophisticated or absent in many organisations. This paper draws on existing literature and case material to analyse the problem facing organisations when planning an IS evaluation exercise. It argues that the factors that can undermine the effectiveness of IS evaluation projects pose major problems. Management apathy may be a rational response to a complex and difficult exercise that often yields little benefit to the organisation.

 

Keywords: IS evaluation, Failure-prone decision process, IS Business value, IS evaluation project

 


Journal Article

Evaluating e‑Commerce Success — A Case Study  pp15-26

Shaun Pather, Dan Remenyi, Andre de la Harpe

© May 2006 Volume 9 Issue 1, Editor: Dan Remenyi, pp1 - 43


Abstract

The business community in the past decade has been characterised by debate over the value and effectiveness of e-Commerce and how this type of technology should be implemented. During this period the business world has witnessed many failures of Internet-based businesses. There is little doubt that the high failure rate among dot-coms had much to do with misconceptions regarding the ease with which e-Commerce could be implemented. Unrealistic expectations caused tried and tested business rules to be abandoned as hyperbole overtook sound business sense. Although it is clear today that the Internet and the Web can facilitate business processes and add value to organisations, this technology has to be managed with considerable care. This paper reports on a case study conducted at kalahari.net, a well-known South African e-Tailing business. The case study highlights several valuable lessons concerning the evaluation of an e-Commerce investment and how to ensure its success. Specifically, it closely examines aspects of kalahari.net's IS management policy and identifies a set of preliminary e-Commerce success dimensions.

 

Keywords: e-Business, e-Commerce, Internet business, web-facilitated business, Information Systems Management, business evaluation, IS success

 


Journal Article

Evaluating Information Systems according to Stakeholders: a pragmatic perspective and method  pp73-88

Jenny Lagsten

© Jan 2011 Volume 14 Issue 1, ECIME 2010 Special Issue, Editor: Miguel de Castro Neto, pp1 - 166


Abstract

In the last decade several researchers have addressed the problem that there is little evidence of extensive use of interpretive evaluation approaches in practice, even though researchers have recognized the interpretive evaluation approach as academically and theoretically well founded, offering potential advantages such as stakeholder commitment and learning opportunities. One reason for this non-use could be that there are few, if any, interpretive evaluation methods ready at hand for evaluators in practice. An interpretive IS evaluation method here means a method that supports doing evaluation as interpretation. This research presents a practical method for evaluating information systems as a joint act of interpretation performed by the stakeholders of the information system in use. In our research we have expanded the interpretive philosophical base to embrace a pragmatic knowledge interest, in order to underpin the overall aim of evaluation: to contribute to change and betterment. The method presented is named VISU (a Swedish acronym for IS evaluation for workpractice development). Evaluating according to the VISU method has been extensively tested in practice and in theoretical grounding processes, and the method is now considered ready for wider use. The research process for developing VISU was conducted as canonical action research, through parallel work on evaluation and method development in six episodes within two cases. VISU consists of prescribed actions anchored in a set of underlying principles stemming from the philosophy of American pragmatism. Evaluation according to VISU is performed in three phases: arrange, evaluate and develop. In the paper VISU is described in terms of phases, actions, main concepts and principles. Its use is demonstrated through examples from an evaluation of an information system supporting social welfare services.

 

Keywords: IS evaluation, stakeholder model, interpretive IS evaluation method, pragmatism, action research

 


Journal Article

A Holistic Framework on Information Systems Evaluation with a Case Analysis  pp57-64

Petri Hallikainen, Lena Chen

© Nov 2006 Volume 9 Issue 2, Editor: Dan Remenyi, pp45 - 104


Abstract

This paper presents a framework for understanding IS evaluation in its broader context, emphasising the role of IS evaluation in integrating the IS development process into the business development process. The framework is applied to analyse a single IS project in detail. The results show that formal IS evaluation is sometimes neither important nor necessary; rather, an informal and flexible evaluation process may matter more, allowing an organisation to quickly gain experience of a new kind of business and system and maintain a leading position in a competitive market.

 

Keywords: information systems projects, IS evaluation, organisational context, holistic framework on IS evaluation

 
