The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications
Journal Issue
Volume 16 Issue 1, ECIME 2012 / Jun 2013  pp1‑84

Editors: Dr. David Sammon, Dr. Tadhg Nagle


Editorial of the ECIME 2012 Special Issue of EJISE  pp1‑2

Dr. David Sammon, Dr. Tadhg Nagle


IT Risk Management: A Capability Maturity Model Perspective  pp3‑13

Val Hooper, Marian Carcary, Tarika Kalidas


Do IS consultants enhance IS competences in SMEs?  pp14‑25

Adrian Bradshaw, Paul Cragg, Venkat Pulakanam


Exploring the Alignment of Organisational Goals with KM: Cases in Four Irish Software SMEs  pp26‑37

Ciara Heavin, Frederic Adam


Information System Evaluation through an Emergence Lens  pp38‑47

Olgerta Tona, Sven A. Carlsson


A novel approach to challenging consensus in evaluations: The Agitation Workshop  pp48‑58

John McAvoy, Tadhg Nagle, David Sammon


Abstract

Abstract: As researchers evaluate organisations, projects, and teams, there is a desire for consensus from those within the organisations who are participating in the research. A common consensual perspective from a team appears to reflect an optimal state where those being evaluated have a common understanding of the current state of events within the context of their environment. The question arises, though, whether an evaluation finding consensus reflects reality: there are a variety of reasons why a common understanding may be a false consensus. Hidden behind this false consensus may be a variety of unaddressed issues which are actually the core of the problem. This paper proposes an evaluation method incorporating the principles of sensemaking and devil's advocacy, where a consensus of perspectives is challenged before it is considered valid. This is achieved in a workshop where participants reflect on their own perception of reality and represent this reality in a matrix of influencing and relevant factors. The individual matrices are then combined and used to highlight disparities in the participants' perspectives through a single matrix visualisation. Discussion in the workshop then focusses on the areas, highlighted by the matrix, where differences of perspective are identified. In effect, the consensus presented by those being evaluated is challenged, and a new common understanding has to be created. Problems such as groupthink can create a false consensus, and it is proposed herein that the workshop provides a mechanism for challenging this. The objective of the research was to determine the feasibility and potential benefits of the proposed workshop; the workshop itself is evaluated in this paper, to determine if it has value.
The benefits of such a workshop are described, showing how an organisation went from a false consensus concerning problems within the organisation to the start of a process to address the real underlying issues.

 

Keywords: consensus, false consensus, workshop, groupthink, evaluation, hidden issues, sensemaking, shared understanding

 

Evaluation of a Collaborative Learning Environment on a Facebook Forum  pp59‑73

M.R. (Ruth) de Villiers, Marco Cobus Pretorius


The Role of Agent Based Modelling in the Design of Management Decision Processes  pp74‑84

Elena Serova
