Analysis of assessments on secondary students' development and interpretation of models
2021
Metrics Details
- Usage: 230
- Downloads: 137
- Abstract Views: 93
Artifact Description
As districts shift to three-dimensional learning, developing a coherent set of high-quality task-based assessments has been a challenge. For this research I collected and analyzed twelve of my district's assessments of the scientific practice of modeling from grades 7 through 12. The analysis used two tools developed by NGSS and Achieve.org, the Task Screener and the Framework to Evaluate Cognitive Complexity in Science Assessments, to determine the extent to which the assessments ask students to perform tasks that are driven by phenomena and use the three dimensions in service of sense-making. The findings support what researchers have said about the shift to three-dimensional task-based assessments: choosing appropriate, engaging phenomena is key to developing high-quality, rigorous assessments. While most of my district's modeling assessments were found to be three-dimensional, they are not rigorous because the phenomena guiding the tasks are too general and not puzzling.