The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.
Journal Issue
Volume 19 Issue 3 / Dec 2016  pp135‑212

Editor: Shaun Pather


EJISE Editorial for Volume 19 Issue 3 2016  pp135‑136

Shaun Pather


Towards a Theory of Multi‑Channel Banking Adoption amongst Consumers  pp137‑157

Kunal Patel, Irwin Brown


Evaluation of the Information Systems Research Framework: Empirical Evidence from a Design Science Research Project  pp158‑168

Stefan Cronholm, Hannes Göbel


Abstract

The purpose of this paper is to provide empirical evidence that the design science framework Information Systems Research (ISR) works in practice. More than ten years have passed since ISR was published in the well‑cited article ‘Design Science in Information Systems Research’, yet there is no thoroughly documented evaluation of ISR based on primary data: existing evaluations are reconstructions of prior studies conducted for other purposes. Using an existing data set to answer new or extended research questions constitutes a secondary analysis. We point to several risks related to secondary analyses and argue that evaluations of popular design science research frameworks should be based on primary data. In this paper, we present an evaluation consisting of empirical experiences based on primary data. We systematically collected experiences from a three‑year research project and present findings consisting of both strengths and weaknesses. The main strengths are the bridging of the contextual environment with the design science activities and the rigour of testing IT artefacts. The main weaknesses are an imbalance in support for contributing to both theory and practice, and ambiguity concerning the practitioners’ role in the design and evaluation of artefacts. We claim that the identified weaknesses can be used for the further development of design science research frameworks and methods.


Keywords: design science, design science research, evaluation, empirical validation, secondary analysis, primary data


The overlapping nature of Business Analysis and Business Architecture: what we need to know  pp169‑179

Tiko Iyamu, Monica Nehemia-Maletzky, Irja Shaanika


Why won’t Google users switch to Bing? Understanding factors that promote and barriers that prevent software users from switching  pp180‑196

Adarsh Kumar Kakar


Trust and e‑government acceptance: The case of Tunisian on‑line tax filing  pp197‑212

Majdi Mellouli, Omar Bentahar, Marc Bidan
