The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.

Journal Article

Evaluation of the Information Systems Research Framework: Empirical Evidence from a Design Science Research Project  pp158-168

Stefan Cronholm, Hannes Göbel

© Dec 2016 Volume 19 Issue 3, Editor: Shaun Pather, pp135 - 212


Abstract

The purpose of this paper is to provide empirical evidence that the design science framework Information Systems Research (ISR) works in practice. More than ten years have passed since ISR was published in the well‑cited article ‘Design Science in Information Systems Research’. However, there is no thoroughly documented evaluation of ISR based on primary data. That is, existing evaluations are based on reconstructions of prior studies conducted for other purposes. Using an existing data set to answer new or extended research questions constitutes a secondary analysis. We point to several risks related to secondary analyses and claim that evaluations of popular design science research frameworks should be based on primary data. In this paper, we present an evaluation consisting of empirical experiences based on primary data. We have systematically collected experiences from a three‑year research project and present findings comprising both strengths and weaknesses. The main strengths are: the bridging of the contextual environment with the design science activities, and the rigorousness of testing IT artefacts. The main weaknesses are: an imbalance in support for making contributions to both theory and practice, and ambiguity concerning the practitioners’ role in the design and evaluation of artefacts. We claim that the identified weaknesses can be used for further development of frameworks or methods concerning design science research.

 

Keywords: design science, design science research, evaluation, empirical validation, secondary analysis, primary data

 


Journal Article

Development and Evaluation of an Automated e‑Counselling System for Emotion and Sentiment Analysis  pp1-19

Emmanuel Awuni Kolog, Calkin Suero Montero, Markku Tukiainen

© Feb 2018 Volume 21 Issue 1, Editor: Shaun Pather, pp1 - 45


Abstract

Given the challenges counsellors face in analysing emotions in text, we present an intelligent e‑counselling system for the automatic detection of emotions and sentiments in text. The system, EmoTect, was developed using a supervised support vector machine (SVM) learning classifier. Students’ life stories were collected and developed into a corpus for training and evaluating the classifier. EmoTect allows users to label instances of the training data based on their own perception of emotions, and then gradually learns to classify emotions according to the user’s perceptions. The EmoTect interface provides a visualisation of emotional changes in automatically analysed student submissions over a selectable period. In this paper, the EmoTect classifier is evaluated with a gold‑standard corpus obtained from students but annotated by counsellors. In addition to the classifier evaluation, the EmoTect prototype was evaluated with counsellors in their own settings. The experimental results show that the EmoTect sentiment classifier achieved accuracy comparable to the gold standard when presented with unseen data. The contextual evaluation of the system indicates counsellors’ satisfaction with, and enthusiasm for, using EmoTect in counselling delivery.
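
The abstract does not give implementation details, but a supervised SVM text classifier of the kind described can be sketched as follows. This is a minimal illustration assuming scikit-learn; the example texts, emotion labels and TF-IDF features are hypothetical and are not taken from the EmoTect corpus or codebase.

# Minimal sketch of a supervised SVM emotion classifier for short texts.
# Assumes scikit-learn; the labelled instances below are hypothetical,
# not drawn from the EmoTect students' life-story corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical labelled training instances (text, emotion label).
texts = [
    "I failed my exam and feel hopeless",
    "I finally passed the course, what a relief",
    "Nobody listens to me at school",
    "My project was praised by the teacher",
]
labels = ["sadness", "joy", "anger", "joy"]

# TF-IDF features feeding a linear-kernel SVM classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())

# Hold out part of the data to stand in for gold-standard evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.25, random_state=42
)
model.fit(X_train, y_train)

# Report per-class precision, recall and F1 on the held-out data.
print(classification_report(y_test, model.predict(X_test), zero_division=0))

In practice the training corpus would contain many labelled stories per emotion class, and the held-out set would be replaced by the counsellor-annotated gold-standard corpus the paper describes.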

 

Keywords: Counselling, Design science research, Emotion classification, Evaluation, Sentiment analysis, Support vector machine

 


Journal Issue

Volume 19 Issue 3 / Dec 2016  pp135‑212

Editor: Shaun Pather


Editorial

Professor Shaun Pather, based in the Faculty of Informatics & Design at the Cape Peninsula University of Technology, Cape Town, South Africa, has spent more than 20 years teaching and researching in the field of ICT management.

His research has focused on the evaluation of Information Systems (IS) effectiveness, particularly within e‑Commerce, e‑Government and other web‑enabled contexts. He has developed models for evaluating e‑Commerce success, and also has an interest in the application of e‑Service Quality evaluation. Shaun has also extended his interest in IS evaluation into practical community engagement and Information Society issues, centred on societal upliftment facilitated by ICTs. He has published in peer‑reviewed journals and has presented papers at several conferences. He has led several research projects with university and government partners in both the private and public sectors. Professor Pather is also a Fulbright Scholar (University of Washington, 2009‑2010).

 

Keywords: Multi-channel, Electronic banking, Internet banking, Mobile banking, Technology Adoption, Grounded Theory, Design science, Design science research, evaluation, empirical validation, secondary analysis, primary data, business analysis, business architecture, parallelism, alignment, roles, responsibilities and organisational structure, Software Switching, Switching costs, Utilitarian Value, Hedonic Value, e-government, on-line tax filing, acceptance factors, personal innovativeness, computer self-efficacy, online trust, system quality, information system

 


Journal Issue

Volume 21 Issue 1 / Feb 2018  pp1‑45

Editor: Shaun Pather


Editorial

 

Keywords: Counselling, Design science research, Emotion classification, Evaluation, Sentiment analysis, Support vector machine

 
