Evaluating HCI research beyond usability

Remy, Christian and Bates, Oliver and Mankoff, Jennifer and Friday, Adrian (2018) Evaluating HCI research beyond usability. In: CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery (ACM), Montréal, Canada. ISBN 9781450356206

2018_chi_sig_eval.pdf - Accepted Version (158 kB)
Available under License Creative Commons Attribution-NonCommercial.

Abstract

Evaluating research artefacts is an important step in demonstrating the validity of a chosen approach. The CHI community has developed and agreed upon a wide variety of evaluation methods for HCI research; sometimes, however, those methods are not applicable or not sufficient. This is especially the case when the contribution lies within the context of the application area, such as research in sustainable HCI, HCI for development, or design fiction and futures studies. In this SIG, we invite the CHI community to share their insights from projects that encountered problems in evaluating research, and we aim to discuss solutions to this difficult topic. We invite researchers from all areas of HCI who are interested in engaging in a debate about issues in the process of validating research artefacts.

Item Type:
Contribution in Book/Report/Proceedings
Additional Information:
© ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI 2018 - Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 2018 http://doi.acm.org/10.1145/3170427.3185371
Uncontrolled Keywords:
/dk/atira/pure/subjectarea/asjc/1700/1712
ID Code:
135799
Deposited On:
23 Aug 2019 08:35
Refereed?:
Yes
Published?:
Published
Last Modified:
25 Oct 2020 08:24