Digital rubric-based assessment of oral presentation competence with technological resources for preservice teachers

  1. Ana-Belén Pérez-Torregrosa (1)
  2. María-Jesús Gallego-Arrufat (2)
  3. Manuel Cebrián-de-la-Serna (3)

  (1) Departamento de Didáctica y Organización Escolar, Facultad de Ciencias de la Educación, Universidad de Málaga
  (2) Departamento de Didáctica y Organización Escolar, Facultad de Ciencias de la Educación, Universidad de Granada
  (3) Departamento de Didáctica y Organización Escolar, Facultad de Ciencias de la Educación, Universidad de Málaga
Journal: ESE: Estudios sobre educación

ISSN: 1578-7001

Publication date: 2022

Issue: 43

Pages: 177-198

Type: Article

DOI: 10.15581/004.43.009 (Open Access)

Abstract

This study focuses on the e-assessment of oral presentation competence using technological resources within a model that combines project-based learning and flipped learning. A digital rubric was used to assess the oral presentation competence of 99 preservice teachers in different situations of progressive assessment, in which participation was either optional or compulsory. Findings show that the digital rubric, applied at various points in time, works as both a methodology and a technology that facilitates the feedback process and the dialogue between teachers and students about the assessment criteria. The results support future decisions on the methodological design of formative assessment appropriate to online learning environments.
