The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.

Journal Article

Evaluation of the Information Systems Research Framework: Empirical Evidence from a Design Science Research Project  pp158‑168

Stefan Cronholm, Hannes Göbel

© Dec 2016 Volume 19 Issue 3, Editor: Shaun Pather, pp135‑212


Abstract

The purpose of this paper is to provide empirical evidence that the design science framework Information Systems Research (ISR) works in practice. More than ten years have passed since ISR was published in the well‑cited article ‘Design Science in Information Systems Research’. However, there is no thoroughly documented evaluation of ISR based on primary data. That is, existing evaluations are based on reconstructions of prior studies conducted for other purposes. Using an existing data set to answer new or extended research questions means conducting a secondary analysis. We point to several risks related to secondary analyses and claim that popular design science research frameworks should be based on primary data. In this paper, we present an evaluation consisting of empirical experiences based on primary data. We have systematically collected experiences from a three‑year research project, and we present findings consisting of both strengths and weaknesses. The main strengths are: the bridging of the contextual environment with the design science activities, and the rigorousness of testing IT artefacts. The main weaknesses are: imbalance in support for making contributions to both theory and practice, and ambiguity concerning the practitioners’ role in the design and evaluation of artefacts. We claim that the identified weaknesses can be used for further development of frameworks or methods concerning design science research.

 

Keywords: design science, design science research, evaluation, empirical validation, secondary analysis, primary data

 


Journal Issue

Volume 19 Issue 3 / Dec 2016  pp135‑212

Editor: Shaun Pather


Editorial

Professor Shaun Pather, based in the Faculty of Informatics & Design at the Cape Peninsula University of Technology, Cape Town, South Africa, has spent more than 20 years teaching and researching in the field of ICT management.

His research has focused on the evaluation of Information Systems (IS) effectiveness, particularly within e‑Commerce, e‑Government and other web‑enabled contexts. He has developed models for evaluating e‑Commerce success, and also has an interest in the application of e‑Service Quality evaluation. Shaun has also extended his interest in IS evaluation into practical community engagement and Information Society issues, centred on societal upliftment facilitated by ICTs. He has published in peer‑reviewed journals and has presented papers at several conferences. He has led several research projects with university and government partners in both the private and public sectors. Professor Pather is also a Fulbright Scholar (University of Washington, 2009‑2010).

 

Keywords: Multi-channel, Electronic banking, Internet banking, Mobile banking, Technology Adoption, Grounded Theory, Design science, Design science research, evaluation, empirical validation, secondary analysis, primary data, business analysis, business architecture, parallelism, alignment, roles, responsibilities and organisational structure, Software Switching, Switching costs, Utilitarian Value, Hedonic Value, e-government, on-line tax filing, acceptance factors, personal innovativeness, computer self-efficacy, online trust, system quality, information system

 
