Cultivating a questioning mind: Student-led question composition in large courses
2023
Artifact Description
Asking a good question is not a trivial task: it requires deep comprehension and the integration of concepts. To foster critical thinking and mastery of foundational concepts in a large second-year undergraduate Genetics course (~1200 students), we decided to actively engage students in question creation. We used "Quizzical", an online platform developed by Prof. Dan Riggs (Riggs et al., 2020, https://doi.org/10.1187/cbe.19-09-0189). Via this platform, students are tasked with creating multiple-choice questions, and for each suggested answer choice they must provide a comprehensive justification, both for the correct answer and for each of the distractors. An added advantage of the platform is the generation of student-authored quiz banks that can be used for practice and participation marks. Since the questions are created by multiple authors, they reflect diverse points of view, which we learned the students greatly appreciated.

To foster metacognition and encourage a shift away from perceiving learning as the memorization of information, students were encouraged to create application-based questions. Higher grades were awarded to questions that creatively integrated multiple concepts or required knowledge application.

To inform our teaching practices, pilot studies were conducted in Fall 2021 and Summer 2022, in which students were asked to complete an anonymous survey about their experiences with Quizzical; the feedback we received was positive overall. We will discuss the learning outcomes achieved by engaging students in question creation, and will share quantitative and formative feedback received from our students. This research was approved by our institutional research ethics board.