Studying Reflection as a Self-Assessment Tool for Online Learning

Ya Mo, Ph.D.

Introduction

Objectives

Assessment in online teaching is challenging due to the nature of the instructional mode and the complexities of assessment itself. It is hard to assess students’ engagement with online materials, where learning happens only when students internalize newly acquired knowledge (Darling-Hammond et al., 2019). Recognizing these challenges, this study examined the effectiveness of using reflection on a combination of selected-response and constructed-response questions as a self-assessment tool in an online learning environment. Students’ perceptions of the usefulness of other online assessment strategies, such as asynchronous discussions and hands-on projects, were also explored.

Review of Literature

Assessment Challenges in Online Learning. The assessment challenges in online learning arise from the physical distance between the instructor and the students. Because of the ongoing need to collect a variety of assessment data and provide feedback, the physical distance increases instructors’ workload, often resulting in time management issues (Kearns, 2012). Tomei (2006) found that online teaching demanded a minimum of 14% more time than traditional instruction. The physical distance also makes it necessary to communicate with students through technology; instructors in online learning face the challenge of providing timely and personalized feedback to help learners achieve the learning goals (Bloxham & Boyd, 2007; Gibbs & Simpson, 2005). Informal assessments, such as observational and participatory assessments, are difficult to conduct without face-to-face interactions (Oncu & Cakir, 2011). Beebe, Vonderwell, and Boboc (2010) summarized five areas of particular concern: (1) time management, (2) student responsibility and initiative, (3) the structure of the online medium, (4) the complexity of content, and (5) informal assessment. Relatedly, Kearns (2012) identified additional assessment issues in online learning from the existing literature: (1) the importance of authentic assessment activities, (2) the use of assessment that encourages academic self-regulation, (3) concerns about academic integrity, and (4) the challenges of assessing online discussion and collaboration.

Assessment Methods in Online Learning. Swan (2001) found that discussion, papers, other written assignments, quizzes and tests, projects, and group work were commonly used as assessment methods in online learning. Arend (2007) and Kearns (2012) identified additional assessment methods, including experimental assignments, problem assignments, journals, presentations, and fieldwork. Online discussion, quizzes and tests, and written assignments were consistently the most commonly used assessment methods in online education.

Effective Assessment Practices in Online Learning. Gaytan and McEwen (2007) identified effective assessment practices in online learning, including projects, portfolios, self-assessments, peer evaluations with feedback, timed tests and quizzes, and asynchronous discussion. Relevant research has also found that online discussion (Davies & Graff, 2005; Vonderwell et al., 2007), immediate elaborated feedback (Tsai et al., 2015), performance and portfolio assessment (Reeves, 2000), self, peer (in particular, learning-oriented peer assessment strategies), and group assessments (Keppell et al., 2006; Roberts, 2006; Tseng & Tsai, 2010), and reflection (Chen et al., 2009; Kayler & Weller, 2007) are among the most effective assessment practices in online learning.

Although a variety of online learning assessment methods have been developed and utilized, few, if any, studies have investigated the effects of self-reflection on a learner’s performance on both selected-response and constructed-response tests. This study intended to build upon the principles of reflective assessment in the hope of identifying additional roles of metacognitive skills in online academic performance.

Method

This study was exploratory, with a focus on capturing and describing the role of reflection in online assessments. Framed as qualitative research (Creswell, 1998; Bogdan & Biklen, 1992), the study utilized a case study approach that focused on capturing and interpreting participants’ experiences of reflection while completing different types of assessment in an online graduate-level assessment and evaluation course. Yin (1989) states that the findings from case studies are perhaps not generalizable, but that they allow researchers to explore and theorize relationships that may otherwise remain undiscovered.

Participants

Five students in a graduate-level assessment course at a public university in the Pacific Northwest participated in the study. At the time of the study, all participants were full-time employees: three worked as middle school or high school teachers of math, English, and religious studies, while the other two worked in other fields and hoped to become a teacher or a social worker.

Instructional Activities

In each module of the online course, students completed a selected-response test and a constructed-response test. After both tests, students were provided with the correct answers and detailed feedback for the selected-response test; students were then asked to write a reflection based on a set of reflective questions regarding their learning. Students’ grades on the selected-response test were a composite of the accuracy of their responses and the completeness with which they addressed the reflection questions.
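The study does not report how the two components were weighted. As a rough illustration only, the following sketch combines test accuracy and reflection completeness under an assumed equal weighting; the function name, the 50/50 weights, and the 0-100 scale are all hypothetical, not the course’s actual grading scheme.

```python
# A minimal sketch of the composite grading described above.
# The equal weighting and the 0-100 scale are assumptions for
# illustration; the study does not report the actual weights.

def composite_score(accuracy, completeness, w_accuracy=0.5, w_completeness=0.5):
    """Combine selected-response accuracy (proportion correct, 0-1) and
    reflection completeness (proportion of questions addressed, 0-1)
    into a single grade on a 0-100 scale."""
    assert abs(w_accuracy + w_completeness - 1.0) < 1e-9, "weights must sum to 1"
    return 100 * (w_accuracy * accuracy + w_completeness * completeness)

# Example: 80% of items correct, all five reflection questions addressed.
print(composite_score(accuracy=0.80, completeness=1.0))  # 90.0
```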

For the constructed-response test, students were provided with an illustrative response and asked to judge their own answer on a five-point scale ranging from “inappropriate” to “completely appropriate” and to give a one- to two-sentence rationale for their choice. The instructor then either agreed or disagreed with the students’ self-assessed grade. In addition to the selected-response test and the constructed-response test, students were expected to post on the online discussion board in response to a prompt and to reply to two classmates’ posts. The same set of assessments, with different content, was repeated across 15 modules of the course. The final project was to create an assessment portfolio that consisted of (1) a selected-response test, (2) a constructed-response test, (3) a performance assessment, (4) a portfolio assessment, and (5) an affective assessment; students also revised those tests based on feedback from teachers and peers.
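To make the self-assessment workflow concrete, the sketch below records a single student’s scale rating, rationale, and the instructor’s agree/disagree judgment. Only the endpoints of the five-point scale come from the course materials; the intermediate labels, field names, and data structure are hypothetical.

```python
# A hypothetical record of one constructed-response self-assessment.
# Only the scale endpoints ("inappropriate", "completely appropriate")
# come from the study; the intermediate labels are assumptions.

from dataclasses import dataclass

SCALE = ("inappropriate", "somewhat inappropriate", "neutral",
         "somewhat appropriate", "completely appropriate")

@dataclass
class SelfAssessment:
    student_rating: int      # index into SCALE, 0-4
    rationale: str           # the student's 1-2 sentence justification
    instructor_agrees: bool  # instructor either agrees or disagrees

    def label(self):
        return SCALE[self.student_rating]

record = SelfAssessment(
    student_rating=3,
    rationale="My answer covered validity but omitted reliability.",
    instructor_agrees=True,
)
print(record.label())  # "somewhat appropriate"
```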

Data Sources

Reflection Questions. Five constructed-response reflection questions were included in each of Modules 01-11, and a set of selected-response reflection questions was included in each of Modules 12-16, to direct students to reflect upon their challenges and learning (see Appendix A).

Questionnaires. The effectiveness of using reflection as a self-assessment tool was evaluated using surveys of students’ perceptions. Three questionnaires were administered during the course to gather students’ feedback across the three main sections of the course content.

Results

Selected-Response Test and Reflection

Out of the five students, four strongly agreed that the automatic feedback for the selected-response test was helpful in deepening their understanding of the course content, and one somewhat agreed with the statement. Two students strongly agreed that the reflection on the selected-response test helped engage them with the automatic feedback and promoted their learning of the content; three somewhat agreed with the statement. When asked which type of reflection questions was more helpful for their learning, students noted that they had to think more and learned more from the constructed-response reflection questions. With the selected-response reflection questions, a student commented, “I honestly just marked answers on it to finish quickly.” However, with the constructed-response reflection questions, a student remarked, “I didn’t always have something to say for each question…I felt obligated to come up with some sort of answer that didn’t always feel genuine.” It appeared that the constructed-response reflection questions challenged students more but did not always align with students’ thinking.

To improve the constructed-response reflection questions, one student suggested either a more simplified form with only two questions, one about how their understanding was extended and one about anything that was confusing, or to “just choose three to respond to and perhaps one more open-ended question that allows you to discuss anything else without a question stem.” Another student suggested adding a little to the prompt to allow students a different option.

Students all liked the grading practice of using a composite score, combining the accuracy of their responses on the selected-response test with the completeness of their answers to the reflection questions, as their grade for the selected-response test. It encouraged careful reflection and allowed them to ask questions and receive feedback from the teacher to clear up their understanding. As one student commented, “it worked well as an assessment as learning.”

Constructed-Response Test and Self-Assessment

Out of the five students, four strongly agreed that the illustrative response for the constructed-response test provided enough information to conduct a self-assessment; one somewhat agreed with the statement. Three students strongly agreed that the self-assessment helped them reflect on their responses and promoted their learning of the content; one somewhat agreed; one neither agreed nor disagreed and thoughtfully suggested that “perhaps after having the student self-assess their response they could write what they would have added or changed about their response…”

Other Online Assessment Practices

All students unanimously chose the portfolio project as the most helpful activity. They liked the portfolio project because it made them think deeply, gave them extensive hands-on experience with both the conceptual ideas and the physical act of assembling meaningful assessments, and was immediately applicable in their classrooms. As a student commented, “the portfolio was what changed this class from a passive learning situation into a professional development opportunity that will have a big impact on my teaching.”

Students perceived the following components of the portfolio project as most helpful to their learning: examples of completed products, feedback from teachers and peers, and an opportunity to revise their assessment products. A student observed, “I think the examples you gave us were a tremendous help to getting us started on some of them and also set a high bar for us. Seeing the finished examples were a great way to visually show the concepts of each chapter.” Students appreciated the feedback, finding it encouraging and helpful for improving their understanding of the content. Furthermore, the opportunity to revise allowed them to incorporate the feedback to improve their assessments.

Surprisingly, two of the five students identified a common formative assessment strategy, asynchronous discussion, as the least helpful activity. One student disliked it because it did not match their preferred learning style and remarked, “I’m someone who likes to learn by themselves… I felt like I didn’t learn much from people’s replies nor answering the question and posting it.” Another disliked it because of the affective aspect of the activity: “I have a lot of anxiety about contributing to discussion boards because I am not sure about cultural norms and I struggle to communicate my opinions in an appropriate way … The instructions provided sentence frames which were helpful. However, the affective aspects of this type of activity are always problematic for me…”

Conclusions

Though students observed that combining traditional tests with reflection creates assessment as learning, the actual form of the reflection questions needs to be more carefully thought out. The reflection questions should encourage students to think more deeply but also allow them options to answer in a way aligned with their thinking. When students are asked to self-assess their answers against a provided exemplary response, they can also highlight how they would improve their answers. Students’ unanimous preference for hands-on projects that apply to their real-world challenges reiterates the importance of authentic assessment activities (Kearns, 2012). Some students’ frustration with asynchronous discussions reminds us that we have a diverse body of learners, especially in online learning. Multiple, flexible, and engaging activities in a variety of modalities should be provided to support the learning outcomes.

This study provided insights into using reflection as a self-assessment tool to enhance students’ knowledge and skills relative to the learning outcomes. Future efforts will include refining and further developing the reflection questions and applying the self-reflection method to different types of assessments, such as portfolios and project-based learning, as well as in different learning contexts, such as social science or medical education. Using reflection as a self-assessment tool will eventually be incorporated into a multi-level assessment protocol that consists of a variety of effective assessment practices for improving students’ metacognitive and learning skills.

 

Appendix A

Constructed-Response Reflection Questions

Description

Rationale: The reflective questions were designed to help you think about which concepts were easy for you, what you learned, and how they were connected to other concepts that you had learned. The questions were also designed to help you identify which concepts were difficult or challenging for you and what made them difficult. Also, the questions were created to provide you with a learning opportunity to study the instant feedback. Usually, when we get an answer correct, the feedback will confirm our thinking; when we get an answer wrong, the feedback will extend or challenge our thinking. However, we may get an answer correct with uncertainty (i.e., by guessing), so the instant feedback may still extend or challenge our thinking. We may also get an answer wrong even though the feedback confirms certain aspects of our understanding that are accurate. Thus, the questions are not arranged in a particular order.

Instructions

There is no word limit (i.e., no minimum and no maximum) for your answers to the reflective questions. Please answer the questions for yourself in a way that best helps you summarize and internalize the acquired knowledge. Each reflection question is worth 1 point. The total score is 5 points. Your answers will be assessed on completeness, not mastery.

Example

I am pleased with my performance on items (e.g., 1st, 2nd, 3rd), because (e.g.) the concepts of … were easy for me, as they were connected to previous concepts of … that I learned / experiences that I had …

  1. I am pleased with my performance on items ___________, because _____________________.
  2. I had difficulty with items ___________, because ___________________________________.
  3. The instant feedback confirmed my understanding of concepts in the following ways (Please provide at least one concrete example to support your statement): _________________________
  4. The instant feedback extended my understanding of concepts in the following ways (Please provide at least one concrete example to support your statement): _________________________
  5. The instant feedback challenged my understanding of concepts in the following ways (Please provide at least one concrete example to support your statement): _________________________

Selected-Response Reflection Questions

Description

The reflective questions were designed to help you think about which concepts were easy, difficult, or challenging for you. Also, the questions were created to provide you with a learning opportunity to study the instant feedback and think about whether the feedback confirms, extends, or challenges your understanding.

Instructions

Please review the instant feedback for the Selected-Response Check of Mastery (Part A). For the Selected-Response Check of Mastery (Part B), please choose the option that best matches your reflection on each question in Part A. If you have any questions/comments, please write them in the optional question section.

Tip: To make your reflection easier, you can split your computer screen so that the Part A results with instant feedback and Part B are shown side by side.

  1. Item 1 was:
    1. easy & the feedback confirmed my understanding.
    2. easy & the feedback extended/challenged my understanding.
    3. difficult & the feedback helped me understand why now.
    4. difficult & I still have questions/comments after reading the feedback.
  2. Item 2 was:
    1. easy & the feedback confirmed my understanding.
    2. easy & the feedback extended/challenged my understanding.
    3. difficult & the feedback helped me understand why now.
    4. difficult & I still have questions/comments after reading the feedback.

(Optional) If you have questions/comments with any of the items, please write your questions/comments below.

References

Anderson, L. W. & Krathwohl, D. R. (2009). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. Longman.

Arend, B. (2007). Course assessment practices and student learning strategies in online courses. Journal of Asynchronous Learning Networks, 11(4), 3-13.

Beebe, R., Vonderwell, S., & Boboc, M. (2010). Emerging patterns in transferring assessment practices from F2F to online environments. Electronic Journal of e-Learning, 8(1), 1-12.

Bloom, B. S. (1956). Taxonomy of educational objectives: Vol. 1. Cognitive domain. New York, NY: McKay.

Bloxham, S., & Boyd, P. (2007). Developing effective assessment in higher education: A practical guide. Maidenhead, UK: Open University Press.

Bogdan, R., & Biklen, S. K. (1992). Qualitative research for education (2nd ed.). Needham Heights, MA: Allyn and Bacon.

Chen, N., Wei, C., Wu, K., & Uden, L. (2009). Effects of high-level prompts and peer assessment on online learners’ reflection levels. Computers & Education, 52, 283-291.

Creswell, J. (1998). Qualitative inquiry and research design: Choosing among five traditions. London, England: Sage.

Davies, J., & Graff, M. (2005). Performance in e-learning: Online participation and student grades. British Journal of Educational Technology, 36, 657-663.

Gaytan, J., & McEwen, B. C. (2007). Effective online instructional and assessment strategies. The American Journal of Distance Education, 21(3), 117-132.

Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1, 3-31.

Kayler, M., & Weller, K. (2007). Pedagogy, self-assessment, and online discussion groups. Journal of Educational Technology & Society, 10(1), 136-147.

Kearns, L. (2012). Student assessment in online learning: Challenges and effective practices. Journal of Online Learning and Teaching, 8(3), 198-208.

Keppell, M., Au, E., Ma, A., & Chan, C. (2006). Peer learning and learning-oriented assessment in technology-enhanced environments. Assessment & Evaluation in Higher Education, 31(4), 453-464.

Oncu, S., & Cakir, H. (2011). Research in online learning environments: Priorities and methodologies. Computers & Education, 57(1), 1098-1108.

Reeves, T. C. (2000). Alternative assessment approaches for online learning environments in higher education. Journal of Educational Computing Research, 23(1), 101-111.

Roberts, T. S. (Ed.). (2006). Self, peer and group assessment in e-learning. IGI Global.

Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22(2), 306-331.

Tomei, L. (2006). The impact of online teaching on faculty load: Computing the ideal class size for online courses. Journal of Technology and Teacher Education, 14(3), 531-541.

Tsai, F., Tsai, C., & Lin, K. (2015). The evaluation of different gaming modes and feedback types on game-based formative assessment in an online learning environment. Computers & Education, 81, 259-269.

Tseng, S.-C., & Tsai, C.-C. (2010). Taiwan college students’ self-efficacy and motivation of learning in online peer assessment environments. The Internet and Higher Education, 13(3), 164-169.

Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous discussions and assessment in online learning. Journal of Research on Technology in Education, 39(3), 309-328.

Webb, N. L. (2002). Alignment study in Language Arts, Mathematics, Science, and Social Studies of state standards and assessment for four states. Washington, DC: Council of Chief State School Officers.

Yin, R. K. (1989). Case study research: Design and methods. Newbury Park, CA: Sage.

License


Emerging Research in Online Learning Copyright © by Ya Mo, Ph.D. is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
