The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications.

Journal Issue
Volume 20 Issue 2 / Nov 2017  pp59‑141

Editor: Shaun Pather

EJISE Editorial for Volume 20 Issue 2  pp59‑60

Shaun Pather

The Status of Integration of Health Information Systems in Namibia  pp61‑75

Nomusa Dlodlo, Suama Hamunyela

E‑Supply Chain Coordination and SME Performance: An Empirical Investigation  pp76‑84

Dr Rui Bi

Impact of the Three IS Qualities On User Satisfaction in an Information‑Intensive Sector  pp85‑101

Sylvie Michel, François Cocula

Making Sense of E‑Commerce Customers Awareness in a Developing Country Context: A Framework for Evaluation  pp102‑115

Husam Yaseen, Moh’d Alhusban, Amal Alhosban, Kate Dingley

Designing Measurement Tools to Improve Response Fluency and Certainty: The Case of Online Customer Satisfaction Evaluation  pp116‑127

Alice Audrezet, Béatrice Parguel

Abstract

The growth of online shopping has gone hand in hand with the need for online, self-administered customer satisfaction evaluation. However, the specific context of online rating, without any face-to-face clarification, raises questions about the accuracy and appropriateness of the chosen measurement tool for respondents. To address this issue, this research proposes the new concept of "response fluency" to describe the ease with which a question is processed. Applied to the Evaluative Space Grid, a grid recently proposed in psychology to measure overall evaluation, the research shows how response fluency mediates the influence of measurement tool design on response certainty. More specifically, it tests the effects of two alternative design formats (i.e., reducing the number of response cells in the grid and displaying labels in the response cells) on response fluency and certainty. Using a between-subjects experiment, we show that displaying labels in the cells increases response fluency and, in turn, response certainty. By contrast, reducing the number of response cells has no effect. We contend that well-designed measurement tools can make the process of responding more fluent and increase respondents' subjective confidence in their ability to convey their true evaluations. Finally, this work calls for new research into measurement tools that engage respondents when answering surveys and reduce dropout, a challenge that is especially acute in self-administered electronic settings.
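
The abstract describes a mediation structure (tool design → response fluency → response certainty) tested in a between-subjects experiment. For illustration only, the sketch below shows one common way such an indirect effect is estimated: regression paths plus a bootstrap confidence interval. The data are simulated and the variable names (labels, fluency, certainty) are assumptions made for this example; they are not the authors' data or analysis.

```python
# Illustrative mediation sketch (assumed variables, simulated data - not the paper's dataset):
# labels (0/1 design manipulation) -> fluency (mediator) -> certainty (outcome).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 200

# Simulated between-subjects data: labels = 1 if the grid cells carry labels, 0 otherwise.
labels = rng.integers(0, 2, n)
fluency = 3.5 + 0.8 * labels + rng.normal(0, 1, n)      # mediator
certainty = 3.0 + 0.6 * fluency + rng.normal(0, 1, n)   # outcome
df = pd.DataFrame({"labels": labels, "fluency": fluency, "certainty": certainty})

# Path a: design manipulation -> fluency
a = smf.ols("fluency ~ labels", df).fit().params["labels"]
# Path b: fluency -> certainty, controlling for the manipulation
b = smf.ols("certainty ~ fluency + labels", df).fit().params["fluency"]

# Bootstrap confidence interval for the indirect (mediated) effect a * b
boot = []
for _ in range(1000):
    s = df.sample(n, replace=True)
    a_s = smf.ols("fluency ~ labels", s).fit().params["labels"]
    b_s = smf.ols("certainty ~ fluency + labels", s).fit().params["fluency"]
    boot.append(a_s * b_s)
low, high = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {a * b:.3f}, 95% CI [{low:.3f}, {high:.3f}]")
```

A confidence interval for the indirect effect that excludes zero is the usual evidence that the mediator (here, response fluency) carries part of the manipulation's influence on the outcome.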

 

Keywords: attitude certainty, processing fluency, web evaluation, online data collection, tool design, instrument

 

Management Research on Social Networking Sites: State of the Art and Further Avenues of Research  pp128‑141

Marwa Mallouli, Zouhour Smaoui Hachicha, Jamil Chaabouni
