A platform for crowdsourcing the creation of representative, accurate landcover maps
Environmental Modelling & Software, ISSN: 1364-8152, Vol: 80, Page: 41-53
2016
- Citations: 41
- Usage: 44
- Captures: 87
Metrics Details
- Citations: 41
  - Citation Indexes: 38
    - CrossRef: 23
  - Policy Citations: 3
- Usage: 44
  - Downloads: 44
- Captures: 87
  - Readers: 87
Article Description
Accurate landcover maps are fundamental to understanding socio-economic and environmental patterns and processes, but existing datasets contain substantial errors. Crowdsourcing map creation may substantially improve accuracy, particularly for discrete cover types, but the quality and representativeness of crowdsourced data are hard to verify. We present an open-sourced platform, DIYlandcover, that serves representative samples of high resolution imagery to an online job market, where workers delineate individual landcover features of interest. Worker mapping skill is frequently assessed, providing estimates of overall map accuracy and a basis for performance-based payments. A trial of DIYlandcover showed that novice workers delineated South African cropland with 91% accuracy, exceeding the accuracy of current generation global landcover products, while capturing important geometric data. A scaling-up assessment suggests the possibility of developing an Africa-wide vector-based dataset of croplands for $2–3 million within 1.2–3.8 years. DIYlandcover can be readily adapted to map other discrete cover types.
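The quality-control idea described above, periodically scoring each worker's delineations against known reference sites and tying payment to the result, can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration, not the platform's actual scoring code: it assumes an area-based agreement measure (intersection-over-union between the worker's polygons and the reference polygons) and an arbitrary payment threshold; the paper's own accuracy metric and payment rules may differ.

```python
# Hypothetical sketch of reference-site scoring for a crowdsourced mapping task.
# The agreement measure (intersection-over-union) and the payment threshold are
# illustrative assumptions, not DIYlandcover's actual rules.
from shapely.geometry import Polygon
from shapely.ops import unary_union


def accuracy_score(worker_polys, reference_polys):
    """Area-based agreement between a worker's delineations and the
    reference delineations for one quality-control site."""
    worker = unary_union([Polygon(p) for p in worker_polys])
    reference = unary_union([Polygon(p) for p in reference_polys])
    if worker.is_empty and reference.is_empty:
        return 1.0  # nothing to map, nothing mapped: perfect agreement
    union_area = worker.union(reference).area
    return worker.intersection(reference).area / union_area if union_area else 0.0


def approve_payment(score, threshold=0.80):
    """Performance-based payment rule (threshold is an assumed placeholder)."""
    return score >= threshold


# Example: a worker's square field overlaps most of the reference field.
worker = [[(0, 0), (10, 0), (10, 10), (0, 10)]]
reference = [[(1, 0), (11, 0), (11, 10), (1, 10)]]
score = accuracy_score(worker, reference)
print(f"agreement = {score:.2f}, pay worker: {approve_payment(score)}")
```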
Bibliographic Details
- http://www.sciencedirect.com/science/article/pii/S136481521630010X
- http://dx.doi.org/10.1016/j.envsoft.2016.01.011
- http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=84959098456&origin=inward
- https://linkinghub.elsevier.com/retrieve/pii/S136481521630010X
- https://api.elsevier.com/content/article/PII:S136481521630010X?httpAccept=text/xml
- https://api.elsevier.com/content/article/PII:S136481521630010X?httpAccept=text/plain
- https://dul.usage.elsevier.com/doi/
- https://commons.clarku.edu/faculty_geography/73
- https://commons.clarku.edu/cgi/viewcontent.cgi?article=1072&context=faculty_geography
- https://dx.doi.org/10.1016/j.envsoft.2016.01.011
Elsevier BV