The Electronic Journal of e-Learning provides perspectives on topics relevant to the study, implementation and management of e-Learning initiatives
Journal Article

Assessment in Massive Open Online Courses  pp207-216

Wilfried Admiraal, Bart Huisman, Olga Pilli

© Apr 2015 Volume 13 Issue 4, ECEL 2014, Editor: Kim Long, pp205 - 315


Abstract

Open online distance learning in higher education has quickly gained popularity, expanded, and evolved, with Massive Open Online Courses (MOOCs) as the most recent development. New web technologies allow for scalable ways to deliver video lecture content, implement social forums and track student progress in MOOCs. However, we remain limited in our ability to assess complex and open‑ended student assignments. In this paper, we present a study on various forms of assessment and their relationship with the final exam score. In general, the reliability of both the self‑assessments and the peer assessments was high. Based on low correlations with final exam grades as well as with other assessment forms, we conclude that self‑assessments might not be a valid way to assess students' performance in MOOCs. Yet the weekly quizzes and peer assessment significantly explained differences in students' final exam scores, with one of the weekly quizzes as the strongest explanatory variable. We suggest that both self‑assessment and peer assessment would be better used as assessment for learning instead of assessment of learning. Future research on MOOCs implies a reconceptualization of education variables, including the role of assessment of students' achievements.

 

Keywords: MOOC, Open Online Learning, Higher education, Assessment, Peer assessment, Self-assessment, Quiz

 


Journal Article

Engaging Students in a Peer‑Quizzing Game to Encourage Active Learning and Building a Student‑Generated Question Bank  pp235-247

Nafisul Kiron et al

© Jul 2020 Volume 18 Issue 3, Editor: Lars Elbæk, pp207 - 274


Abstract

Games are a great source of entertainment and are used by people of all ages; they motivate and engage people and affect their behavior. Therefore, games have been widely studied in many non‑game contexts. Education is one of the areas where gamified and game‑based learning strategies have been implemented and explored. To engage and motivate students to quiz each other, to build a question bank as a side effect, and to study students' gaming and learning behavior, we used a peer‑quizzing game called "Tower of Questions" (ToQ). The game uses themes and mechanics found in tower defense (TD) games. Students received points, in the form of gems, for posing and answering questions in the game. They played under pseudonyms for one academic term and were told not to disclose their identities to anyone. We conducted a 3‑month study for two consecutive years in the same first‑year undergraduate computer science course. In this paper, we present the findings from our studies using ToQ, specifically findings related to students' self‑monitoring and quizzing activities, based on the game logs and two self‑reported surveys collected in the second year of the study.

 

Keywords: gamification, game-based testing, peer-quizzing, incentives, engagement

 
