
An information system design theory for the comparative judgement of competences

  • Open access document (DOC, 1.59 MB)
Bibliographic reference: Coenen, Tanguy; Coertjens, Liesje; Vlerick, Peter; Lesterhuis, Marije; Mortier, Anneleen Viona; et al. An information system design theory for the comparative judgement of competences. In: European Journal of Information Systems, Vol. 27, no. 2, p. 248-261 (2018)
Permanent URL: http://hdl.handle.net/2078.1/201371