The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications



Journal Article

IS Evaluation in Practice  pp169-178

Ann Brown

© Jan 2006 Volume 8 Issue 3, ECITE 2005 Special, Editor: Dan Remenyi, pp143 - 230


Abstract

IS evaluation exercises continue to engender scepticism. The evaluation of IS investment is considered a 'wicked problem', and there are good reasons for this judgement. The topic has attracted many researchers. There is a substantial body of literature on the problems of measurement and the inadequacies of traditional investment appraisal methods. A wide range of alternative tools has been proposed to replace these approaches, but many surveys of actual practice have found little evidence of their use. Reported IS evaluation practice appears to be relatively unsophisticated or absent in many organisations. This paper draws on existing literature and case material to analyse the problem facing organisations when planning an IS evaluation exercise. It argues that the factors that can undermine the effectiveness of IS evaluation projects pose major problems. Management apathy may be a rational response to a complex and difficult exercise that often yields little benefit to the organisation.

 

Keywords: IS evaluation, Failure-prone decision process, IS Business value, IS evaluation project

 


Journal Article

The Adoption of new Application Development Tools by IT Professionals from the Viewpoint of Organisational Learning  pp197-206

Torsti Rantapuska

© Jan 2006 Volume 8 Issue 3, ECITE 2005 Special, Editor: Dan Remenyi, pp143 - 230


Abstract

The productivity and innovativeness of information work are becoming important issues among information workers. This paper explores the working and learning of IS professionals when adopting new application development tools. I study how IS professionals work, communicate, think through problems, and learn in the course of getting work done. I also analyse the changes that adoption causes to individual styles of working. The research questions are formulated as follows: 1) What contributes to the effective use of IT tools? 2) How does the adoption of new tools affect individual working methods? The research is based on interviews with fourteen young professionals who had recently started using a new application development tool. The interviews were conducted at their workplaces. The focus is on learning at work. Special attention is paid to the initial motivation for the innovation, to knowledge acquisition, and to communication with team members during the problem-solving process. According to the findings, the IS professionals' working style is personal and context-oriented. As learners they interact little with their peers and make little use of systematic working methods. The Internet and help systems, rather than colleagues and textbooks, are used as the basis of group interaction and as sources of knowledge. The systematic orientation of working practice is limited to the context at hand. Finally, the results are discussed and recommendations are proposed for improving the software process.

 

Keywords: software process innovations, organisational learning, adoption, individual learning styles

 


Journal Article

A Process Capability Approach to Information Systems Effectiveness Evaluation  pp7-14

Sevgi Ozkan

© Mar 2006 Volume 9 Issue 1, Editor: Dan Remenyi, pp1 - 43


Abstract

While defining or measuring the effectiveness of the information systems (IS) function has proven complicated, further effort on refining IS assessment is essential for the effective management and continuous improvement of both the IS function and the organisation. In addition, an effort to investigate the relationships among the established IS assessment tools, to better reconcile their existing differences, is warranted. This paper aims to clearly differentiate the notions of 'Software' and 'Information Systems'. A new IS assessment model is proposed to provide a more holistic view of how IS quality may be assessed, based on a process capability approach to evaluating IS effectiveness within the organisational context.

 

Keywords: Information systems quality, Information systems effectiveness, Assessment, Software process maturity, Process capability

 


Journal Article

An Interactive and Iterative Evaluation Approach for Creating Collaborative Learning Environments  pp83-92

Anita Mirijamdotter, Mary M. Somerville, Marita Holst

© Nov 2006 Volume 9 Issue 2, Editor: Dan Remenyi, pp45 - 104


Abstract

Inspired by a three-year Creative University 'arena' initiative at Luleå University of Technology in Sweden, an international team of faculty researchers conducted an exploratory study in 2005, which aimed to investigate the efficacy of an interactive design and evaluation process for technology-enabled collaborative learning environments. This applied research approach was designed as a collaborative evaluation process for co-creation of technology-enabled, learning-focused physical and virtual 'learning commons.' Faculty researchers from Sweden and the United States used Soft Systems Methodology tools, including the Process for Organisational Meanings (POM) model, to guide sixty-two students' participatory co-design and evaluation activities. In this paper, the POM evaluation model is explained and related to the Japanese concept Ba. Application of the models is illustrated within the context of student learning through boundary-crossing information exchange and knowledge creation. As evidenced in their iterative and interactive evaluative recommendations, students' learning outcomes included development of improved capabilities for identifying socio-technical elements of distributed learning environments, suggesting that student beneficiaries can successfully reflect upon their experiences and provide valuable evaluation insights. In addition, when this evaluation is iterative, students' insights into project management, software needs, and services design can improve their technology-enabled learning experiences. Concluding comments explore the efficacy of the POM model implementation for guiding other learning-focused, user-centric initiatives, which aim to promote interdisciplinary, or boundary-crossing, exchanges concurrent with advancing team-based knowledge creation proficiencies among project participants.

 

Keywords: interactive formative evaluation, learning commons, soft systems methodology, process for organisational meanings, POM, model, Ba, higher education pedagogy

 


Journal Article

Causal Relationships between Improvements in Software Development Processes and Final Software Product Quality  pp1-10

Rini van Solingen, Egon Berghout

© Mar 2008 Volume 11 Issue 1, Editor: Dan Remenyi, pp1 - 51


Abstract

A main assumption of software process improvement (SPI) is that improvements in a software development process result in higher-quality software products. In other words, SPI assumes the existence of causal relations between process and product characteristics. To what extent, however, have these causal relations been explored? Which specific process improvements have which particular impact on which particular product quality attributes? In this paper an overview is given of these "software process and product dependencies" (PPD). The overview comprises a list of SPI techniques and the product quality attributes those techniques address. The extent of the causality is investigated, as is the possibility of identifying more or less effective strategies for product quality improvement. The overview is based on a literature study and expert evaluation. The research is summarised in a matrix of software process elements and associated software product quality characteristics. This matrix contains both satisfactory and unsatisfactory results. On the one hand, a promising, extensive base of publications on techniques and methods was identified. On the other, a disappointing deficiency of empirical validation of the actual impact of those techniques on product quality is also prominent. As it stands, we are left with an inadequate and incomplete indication of the product characteristics that particular software process improvement techniques intend to ameliorate. This article therefore also aims to provide a basis for discussion of the need to make process-product dependencies more explicit.

 

Keywords: software development, software process improvement, learning, product-process dependencies, PPD

 


Journal Article

Interpretative IS Evaluation: Results and Uses  pp97-108

Jenny Lagsten, Göran Goldkuhl

© Jun 2008 Volume 11 Issue 2, Editor: Dan Remenyi, pp51 - 108


Abstract

One major reason for evaluating information systems is to take action based on the results of the evaluation. To make better use of interpretive evaluation processes in practice, we need to understand what kinds of results such evaluations produce and how those results are transformed into change and betterment in the organisation. We have developed, applied and studied a methodology to support interpretive evaluation. In this paper we report on an action research study that comprised an IS evaluation. Through this action research we have transformed the theoretical principles of the interpretive approach into an evaluation methodology that is useful in practice. The main emphasis of this study is on the results and uses of the evaluation process. We give a brief theoretical overview of interpretive principles for IS evaluation and of the research on evaluation use from the field of evaluation theory, and present a framework for analysing influences from evaluation efforts. We use this framework to analyse and identify the results and uses of the performed evaluation in order to shed light on the kinds of results that interpretive evaluation may offer. We found the influence framework useful for locating and understanding the variety of results from interpretive evaluation processes. We conclude with a model depicting results and uses of interpretive IS evaluation processes. The main point we elaborate on is how evaluations influence the actions taken in the organisation to establish betterment, and how people in the organisation use evaluation to bring about betterment and change. Finally, we feed these insights on evaluation results and uses back into the discussion of how to design interpretive evaluation processes and the methodology that supports them.

 

Keywords: IS evaluation, evaluation process, evaluation results, evaluation use, interpretative evaluation methodology

 


Journal Article

Is a Multi‑Criteria Evaluation Tool Reserved for Experts?  pp151-162

C. Sanga, I. M Venter

© Feb 2010 Volume 12 Issue 2, Editor: Shaun Pather, pp129 - 198


Abstract

The objective of this investigation was to determine whether the analytical hierarchy process algorithm is suitable for the evaluation of software by evaluators with little Information Technology experience. The scope of the research was the evaluation of two free and open source e-learning systems at the Open University of Tanzania by 33 stakeholders with diverse levels of Information Technology experience. Both quantitative and qualitative research methods were used. The qualitative methods comprised participative observation and interviews; questionnaires and the analytical hierarchy process, a multiple-criteria decision-making algorithm, constituted the quantitative methods. The results showed that, of the two e-learning systems evaluated, Moodle was preferred over ATutor. Furthermore, it was found that the analytical hierarchy process algorithm is appropriate for the evaluation of software in situations where Information Technology experience is limited. It is anticipated that the paper contributes to the theory and practice of decision making in developing countries such as Tanzania.
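The analytical hierarchy process mentioned in this abstract derives priority weights for decision criteria from a matrix of pairwise comparisons. As a rough illustration of the mechanics only (the criteria names and judgement values below are invented for the example and are not taken from the study), a minimal sketch using the geometric-mean weighting method and Saaty's consistency check:

```python
import math

def ahp_weights(matrix):
    """Derive priority weights from a pairwise-comparison matrix
    using the geometric-mean method."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

def consistency_ratio(matrix, weights):
    """Approximate Saaty's consistency ratio (CR); CR < 0.1 is
    conventionally taken as acceptably consistent judgements."""
    n = len(matrix)
    # lambda_max estimated from the weighted row sums
    lam = sum(
        sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    ci = (lam - n) / (n - 1)                      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random indices
    return ci / ri

# Hypothetical judgements: how much more important is each criterion
# than the others (usability vs. features vs. support)?
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
w = ahp_weights(pairwise)
```

Here `w` ranks the criteria (usability weighted highest under these sample judgements), and the consistency ratio flags whether a novice evaluator's pairwise judgements contradict one another, which is one reason the method can be packaged for users with limited IT experience.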

 

Keywords: free and open source software, e-learning systems, software quality, multi-criteria evaluation tool, analytical hierarchy process, novice user, developing country

 


Journal Article

Alliance Decision Making of SMEs  pp13-26

Karla Diaz, Ute Rietdorf, Utz Dornberger

© Jan 2011 Volume 14 Issue 1, ECIME 2010 Special Issue, Editor: Miguel de Castro Neto, pp1 - 166


Abstract

Hardly any sector of economic activity has remained untouched by the trend towards inter-firm collaboration, particularly among large enterprises, yet collaboration remains uncommon among many SMEs, especially in some developing countries. The SMEs' advantages of speed and flexibility are clouded by a lack of the resources and skills needed to develop business within networks. Successful development in some economies, mainly in Asia, has been based on the effective participation of SMEs in linkages as a strategy to cover the scarcities they face. This strategy now plays an important role on the agenda in many Latin American countries, but there is still a lack of information to make it more popular among SMEs there. The traditional literature in developed countries has focused on large companies to explain what makes an alliance successful, what the relationship between alliance partners should be, and which alliance structure or type of contract may make or break an alliance, but little research has explored alliances as a strategy to develop SMEs. The critical role of the decision-making process regarding the choice to engage in an alliance deserves particular research attention. This paper focuses on the alliance decision-making process, with special emphasis on SMEs. Its main contribution is a framework of the factors that influence the alliance decision-making process. Drawing on social capital and social exchange theory, the research concentrates its analysis on a sample of SMEs from Mexico in which both alliance-experienced and alliance-inexperienced entrepreneurs were considered. Our proposal included twelve variables, which were analysed to find their impact on alliance decision making.
The results show that an internal alliance initiative, frequent enterprise diagnosis, trust based on partners' prestige, and smaller or similar characteristics of potential partners have a strong influence on a positive alliance decision. Contrasting characteristics were found between alliance-experienced and alliance-inexperienced entrepreneurs.

 

Keywords: Alliances, decision-making, factors, process, SMEs

 
