Attention Visual
2023
- Usage: 9,226
Metric Options: Counts
Selecting the 1-year or 3-year option changes the metric counts to percentiles, illustrating how an article or review compares to other articles or reviews published in the same journal within the selected time period. The 1-year option compares the metrics against other articles/reviews published in the same calendar year. The 3-year option compares the metrics against other articles/reviews published in the same calendar year plus the two years prior.
Example: if you select the 1-year option for an article published in 2019 and a metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019. If you select the 3-year option for the same article published in 2019 and the metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019, 2018 and 2017.
Citation Benchmarking is provided by Scopus and SciVal and is different from the metrics context provided by PlumX Metrics.
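The percentile comparison described above can be illustrated with a short sketch. PlumX does not publish its exact formula, so this assumes "performing better than 90%" means 90% of the cohort's metric values are strictly lower; the function name and sample numbers are hypothetical.

```python
def percentile_rank(value, cohort):
    """Percentage of cohort metric values that the given value exceeds."""
    if not cohort:
        return 0.0
    below = sum(1 for v in cohort if v < value)
    return 100.0 * below / len(cohort)

# 1-year option: cohort = metrics for other articles/reviews published
# in the same journal in 2019 (illustrative values only)
cohort_2019 = [10, 25, 40, 55, 70, 85, 120, 200, 310]

# 3-year option: cohort additionally includes 2018 and 2017 articles
cohort_2017_2019 = cohort_2019 + [15, 30, 60, 95, 150, 250]

print(percentile_rank(150, cohort_2019))       # rank within the 1-year cohort
print(percentile_rank(150, cohort_2017_2019))  # rank within the 3-year cohort
```

Note that the same article can land at different percentiles under the two options, since the 3-year cohort is larger.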
Metrics Details
- Usage: 9,226
- Abstract Views: 6,894
- Downloads: 2,332
Thesis / Dissertation Description
This research presents an innovative approach to improving visual-spatial attention using a web-based research tool. Recognizing the significant role visual-spatial attention plays in everyday life and human cognitive function, this research was undertaken with the aim of developing a user-friendly, accessible web-based tool called Attention Visual (attentionvisual.com) to enhance this crucial cognitive skill. The tool also facilitates data collection, potentially accelerating the pace and enhancing the quality of related research. Both qualitative and quantitative methods were utilized for data collection and analysis. To stimulate improvements in visual-spatial attention, the tool’s algorithm was structured to adjust task difficulty according to the user’s performance: heightened performance would yield more challenging tasks, whereas lower performance would result in easier tasks, fostering an adaptive and progressive learning environment. The main hypothesis underlying this research was that regular use of the tool could result in measurable enhancements in visual-spatial attention, with potential benefits for various population groups, from athletes to individuals with certain cognitive conditions. The results validate this hypothesis, demonstrating the effectiveness of the web-based tool in enhancing visual-spatial attention and indicating that the design elements of the tool have a positive impact on user performance. The research additionally achieved a wide range of participant diversity, thanks to the online nature of the tool, enhancing the robustness and generalizability of the data collected. These findings contribute significantly to the fields of cognitive science, neuroplasticity, and digital tool development, offering valuable insights for future research.
They demonstrate the effectiveness of web-based tools in cognitive science research and suggest potential avenues for future investigation, such as exploring other aspects of visual cognition or the application of such tools in practical settings like cognitive therapy and rehabilitation.
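The adaptive mechanism described in the abstract (harder tasks after good performance, easier tasks after poor performance) resembles a simple up/down staircase procedure. The following is a hypothetical sketch of that idea, not the actual Attention Visual implementation; the function name, step size, and level bounds are assumptions.

```python
def adjust_difficulty(level, correct, step=1, min_level=1, max_level=10):
    """Up/down staircase: raise difficulty after a correct response,
    lower it after an incorrect one, clamped to [min_level, max_level]."""
    if correct:
        return min(level + step, max_level)
    return max(level - step, min_level)

# Simulate a short session: difficulty tracks the user's responses
level = 5
for correct in [True, True, False, True]:
    level = adjust_difficulty(level, correct)
print(level)
```

A staircase of this kind keeps the task near the threshold of the user's ability, which is one way to realize the "adaptive and progressive learning environment" the abstract describes.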
Bibliographic Details