The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.
For general enquiries email administrator@ejise.com


Journal Article

Seven Ways to get Your Favoured IT Project Accepted — Politics in IT Evaluation  pp31-40

Egon Berghout, Menno Nijland, Kevin Grant

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80

Look inside Download PDF (free)

Abstract

IS managers are under increasing pressure to justify the value of corporate IT/IS expenditure. Their constant quest for the 'holy grail' continues, as existing methods and approaches for justifying IT/IS expenditure are still failing to deliver. The decision-making process is not as objective and transparent as it is claimed or intended to be. This paper discusses seven tactics commonly used by business managers to influence IT appraisals. The paper takes a 'devil's advocate' position and adopts some irony when looking at the area of power and politics in IT evaluation. Rather than promoting the use of these techniques, this article aims to raise awareness that IT evaluation is not as rational as most IT evaluation researchers and practitioners would want it to be, or indeed claim it to be. It is argued that rationalisation or counter-tactics may counteract influence techniques in an attempt to get behind the cloak-and-dagger side of organisational power and politics, but politics and power in decision-making cannot and should not be filtered out. Owing to dissimilar objectives and the limitations of time and information, influence techniques will always be used. However, rather than being counterproductive, these techniques are essential to the process of making decisions about IT projects. They help organisations reach better decisions, which receive more commitment than decisions that were forced to comply with strictly rational approaches. Awareness of the influence and manipulation techniques used in practice will help organisations deal with power and politics in IT evaluation and thereby reach better IT investment decisions.

 

Keywords: IT Evaluation, IT Decision Making, IT Assessment, Information Economics, Decision Making, Organisational Power & Politics, Information Management

 


Journal Article

Peer Assessment: A Complementary Instrument to Recognise Individual Contributions in IS Student Group Projects  pp61-70

Elsje Scott, Nata van der Merwe, Derek Smith

© Jan 2005 Volume 8 Issue 1, Editor: Dan Remenyi, pp1 - 80

Look inside Download PDF (free)

Abstract

This paper discusses peer assessment as a component of the assessment strategy used for Information Systems student group projects at a South African university. The value of peer assessment, and its contribution to the real-life experience offered by group projects, is discussed, and the paper illustrates how this process adds value by enhancing deep learning. Its value as a complementary instrument in a multiple assessment strategy, and the way in which the results of peer assessment are used to recognise individual contributions to group performance, are also illustrated. The use of peer assessment as an instrument for both informal formative assessment and formal summative assessment is described. To perform the peer assessment, specific instruments were designed and used throughout the lifecycle of the course.
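
The abstract does not reproduce the authors' scoring scheme, but a widely used way of turning peer ratings into recognition of individual contributions is a peer-assessment factor that scales the shared group mark by each member's share of the ratings. The Python sketch below only illustrates that general idea; the rating scale, the cap and the function names are assumptions, not the instruments described in the paper.

from statistics import mean

def individual_marks(group_mark, ratings):
    """Scale a shared group mark by each member's peer-assessment factor.

    `ratings` maps each member to the scores their peers gave them
    (assumed here to be on a 0-10 scale). The factor is the member's
    mean rating divided by the mean rating across the group, capped
    at 110% of the group mark so peer inflation stays bounded.
    """
    member_means = {name: mean(scores) for name, scores in ratings.items()}
    group_mean = mean(member_means.values())
    return {name: round(min(group_mark * m / group_mean, group_mark * 1.1), 1)
            for name, m in member_means.items()}

# Hypothetical example: a group project marked at 72%, peer ratings out of 10.
print(individual_marks(72.0, {"Alice": [9, 8, 9],
                              "Thandi": [7, 7, 6],
                              "Pieter": [5, 6, 5]}))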

 

Keywords: Peer assessment, group work, assessment, self-assessment, IS Project

 


Journal Article

A Process Capability Approach to Information Systems Effectiveness Evaluation  pp7-14

Sevgi Ozkan

© Mar 2006 Volume 9 Issue 1, Editor: Dan Remenyi, pp1 - 43

Look inside Download PDF (free)

Abstract

While defining or measuring the effectiveness of the information systems (IS) function has proven complicated, further effort on refining IS assessment is essential for the effective management and continuous improvement of both the IS function and the organisation. In addition, an effort to investigate the relationships among the established IS assessment tools, so as to better reconcile their existing differences, is warranted. This paper aims to clearly differentiate the notion of 'Software' from that of 'Information Systems'. A new IS assessment model is proposed to provide a more holistic view of how IS quality may be assessed, by means of a process capability approach to evaluating IS effectiveness within the organisational context.

 

Keywords: Information systems quality, Information systems effectiveness, Assessment, Software process maturity, Process capability

 


Journal Article

Proposal of a Compact IT Value Assessment Method  pp73-82

Przemyslaw Lech

© Jan 2007 Volume 10 Issue 1, ECITE 2006 Special, Editor: Dan Remenyi, pp1 - 122

Look inside Download PDF (free)

Abstract

This paper proposes a compact IT value assessment method. It starts from the assumption that most publicly available methods are either described in a very general manner or concentrate on only one aspect of evaluation. The proposed method relates the evaluation approach to the main characteristics of the IT initiative, such as the investment purpose and the IT element to be implemented. Based on these criteria, the evaluation process is shaped by putting emphasis on the relevant evaluation aspects and choosing the relevant evaluation methods. The method is designed for ease of use and practical relevance, so that IT practitioners can use it to assess IT initiatives in their organisations. The paper finishes with a case study of the method's use in a mid-sized production enterprise.
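
At its core, the abstract describes a lookup from initiative characteristics (investment purpose, IT element) to the evaluation aspects and techniques to emphasise. The paper defines that mapping itself; the Python sketch below only illustrates the kind of compact decision table such a method implies, and all of the category names and technique labels are hypothetical.

# Hypothetical decision table: (investment purpose, IT element) ->
# evaluation aspects to emphasise and candidate techniques.
# The categories are illustrative placeholders, not the paper's classification.
EVALUATION_GUIDE = {
    ("cost reduction", "infrastructure"): {
        "aspects": ["direct costs", "operational savings"],
        "techniques": ["total cost of ownership comparison", "payback period"],
    },
    ("revenue growth", "business application"): {
        "aspects": ["business benefits", "strategic fit"],
        "techniques": ["NPV of incremental cash flows", "benefits register"],
    },
    ("compliance", "business application"): {
        "aspects": ["risk exposure", "mandatory cost"],
        "techniques": ["risk-adjusted cost comparison"],
    },
}

def shape_evaluation(purpose, element):
    """Return the evaluation aspects and techniques to emphasise for an
    IT initiative with the given characteristics."""
    try:
        return EVALUATION_GUIDE[(purpose, element)]
    except KeyError:
        raise ValueError(f"no guidance defined for ({purpose!r}, {element!r})")

print(shape_evaluation("cost reduction", "infrastructure"))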

 

Keywords: IT value assessment, IT evaluation, practical method, case study

 


Journal Article

Assessing Information Management Competencies in Organisations  pp179-192

Andy Bytheway

© Sep 2011 Volume 14 Issue 2, ICIME 2011, Editor: Ken Grant, pp167 - 281

Look inside Download PDF (free)

Abstract

The history of the management of information systems includes many ideas that were intended to simplify the complexities of the management task, but there is still a great deal of wasted investment that produces no significant benefits. Much of the thinking has been rational and structured, but it can be argued that structured thinking will not solve the problems presented by the ever-increasing scope and depth of information systems, the need for improved responsiveness and agility, and the need to deal with a range of requirements that are sometimes behavioural and sometimes legislative. Three of the more frequently cited frameworks for information management (Zachman, Henderson & Venkatraman, Ward) are briefly reviewed and found to have common characteristics. They are combined into a new, simple arrangement of the central (and critically important) ideas. This new framework has been used as the basis of a survey instrument that is introduced and explained; it works at two levels, the "micro" and the "macro". It assesses perceptions of organisational capability to manage information well, as seen by respondents who are normally employees working in different roles with varying responsibilities. The survey instrument comes with an analysis and reporting package that is found to be suitable for the needs of busy managers, and the way in which micro and macro data are presently analysed and presented is demonstrated using data from a reference dataset, a CIO workshop, an investigation within a real estate agency, and a large financial services organisation. The contribution of this work to the research programme from which it emanated is summarised and future directions are briefly explained.
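
The abstract indicates that the instrument gathers respondents' perceptions of information management capability and reports them at a "micro" and a "macro" level. The published analysis and reporting package is not reproduced here; the Python sketch below merely illustrates, with hypothetical capability items and roles, how a two-level aggregation of perception scores might be organised.

from collections import defaultdict
from statistics import mean

# Each response records the respondent's role and a perception score
# (assumed 1-5) per capability item. Item names are hypothetical,
# not the questions used in the actual instrument.
responses = [
    {"role": "CIO",     "scores": {"strategy alignment": 4, "benefits tracking": 2}},
    {"role": "manager", "scores": {"strategy alignment": 3, "benefits tracking": 3}},
    {"role": "analyst", "scores": {"strategy alignment": 2, "benefits tracking": 4}},
]

def micro_view(responses):
    """Micro level: mean perception score per capability item within each role."""
    scores = defaultdict(list)
    for r in responses:
        for item, score in r["scores"].items():
            scores[(r["role"], item)].append(score)
    return {key: round(mean(vals), 2) for key, vals in scores.items()}

def macro_view(responses):
    """Macro level: organisation-wide mean perception score per capability item."""
    scores = defaultdict(list)
    for r in responses:
        for item, score in r["scores"].items():
            scores[item].append(score)
    return {item: round(mean(vals), 2) for item, vals in scores.items()}

print(micro_view(responses))
print(macro_view(responses))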

 

Keywords: information management, perceptions, IS/IT strategy, alignment, assessment

 


Journal Issue

Volume 9 Issue 2 / Nov 2006  pp45‑104

Editor: Dan Remenyi

View Contents Download PDF (free)

Editorial

Once again we have received an interesting range of research papers from authors around the world, and they continue to represent a very wide range of thought with regard to the different applications of evaluation thinking for information and communication technology. It is clear that this field has not yet produced a clear consensus on any particular methodology, and I for one believe that this is what one might loosely call a “good thing”.

Six papers have been selected by our reviewers through the process of double-blind peer review, yielding six very interesting and yet different papers from authors in Sweden, Spain, The Netherlands, Ireland and Greece.

I trust readers will find these pieces of research as interesting as I have.

 

Keywords: IS integration, activity-based costing, assessment, business evaluation, cost management systems, e-business, e-commerce, enterprise modelling, evaluation framework, event study methodology, information systems effectiveness, information systems management, information systems quality, information technology productivity paradox, internet business, IS success, IT investment, process capability, project portfolio, risk management, software process maturity, system analysis metrics, value-at-risk, web-facilitated business

 


Journal Issue

Volume 10 Issue 1, ECITE 2006 Special / Jan 2007  pp1‑122

Editor: Dan Remenyi

View Contents Download PDF (free)

Editorial

Another edition of EJISE brings to the attention of the information systems community 10 more pieces of research into how information systems may be evaluated. The contributions in this issue are from 9 different countries and from a diverse range of universities and business schools.

When I first became actively interested in information systems evaluation in 1990, I had no idea how wide and how deep an issue information systems evaluation was. I had thought that it was worth a few papers and maybe a book or two. Today my view is entirely different, and I wonder if the community of information systems academics and practitioners will ever reach a point at which there is general agreement as to how to evaluate or assess information systems. My best guess would be that they probably will not.

However, as it was put to me at the start of my university studies, academics tend to have far more questions than answers, and this may not necessarily be a ‘bad’ thing. If we continue to ask the right questions, even if we cannot find definitive answers, we are effectively moving the frontier of knowledge forward. And that, I suggest, is in the end the most important objective of academe.

I hope that you will find a number of interesting topics among these 10 papers.

 

Keywords: IS integration, auditing, balanced score card, business process facilitation, case study, confidentiality, domain specific languages, e-Government project evaluation, enterprise information system, CEO framework, ex post evaluation, functional-operational match, ICT benefits, ICT evaluation, ICT project, information economics, Information System Architecture, IS outsourcing, IT evaluation, IT value assessment, knowledge management, meta-modelling tools, motivational factors, user satisfaction surveys, web content management, WLAN

 
