Crowdsourcing Land Cover and Land Use Data: Experiences from IIASA

Presented at the workshop "Quantifying Error in Training Data for Mapping and Monitoring the Earth System", held on January 8-9, 2019 at Clark University, with support from Omidyar Network's Property Rights Initiative, now PlaceFund.

  1. Crowdsourcing land cover and land use data: Experiences from IIASA. Workshop on Quantifying Error in Training Data and its Implications for Land Cover Mapping, Clark University, Worcester, MA, USA. Presenter: Juan Carlos Laso Bayas, Research Scholar, Center for Earth Observation and Citizen Science (EOCS), Ecosystems Services and Management (ESM), International Institute for Applied Systems Analysis (IIASA).
  2. Crowdsourcing data collection
     • Citizen science campaigns: cropland validation, field size, Picture Pile
     • Mobile applications: directed (FotoQuest), opportunistic (AgroTutor, GROW), hybrid (LACO-Wiki mobile), alerts (FloodCitiSense)
     • Remote sensing: night lights for poverty mapping, oil palm mapping in Indonesia
  3. Cropland validation campaign: Design. 80 volunteers, 144,000 validations, 2,000 control locations, 60 expert points; 36,000 locations worldwide in 3 weeks.
  4. Cropland validation: Geo-Wiki interface. A 300 x 300 m grid to classify, background layers, supporting tools (e.g. NDVI), additional tools (e.g. Google Earth and examples), and submission of the classification.
  5. Cropland validation: Control and results. Quality score: one control image randomly inserted into every 20 images; no immediate feedback; incentives ranging from €750 to €25, plus a scientific publication. Mean cropland percentage per location: Laso Bayas et al. (2017), Nature Scientific Data.
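A minimal sketch of how a control-based quality score of this kind might be computed; the interleaving and scoring logic below are illustrative assumptions, not the campaign's actual implementation:

```python
import random

def interleave_controls(tasks, controls, block=20):
    """Insert one randomly placed control image into every block of 20
    images, matching the slide's 1-in-20 control rate (the placement
    logic here is an assumption)."""
    stream = []
    for i in range(0, len(tasks), block - 1):
        chunk = tasks[i:i + block - 1]        # 19 real tasks per block
        slot = random.randint(0, len(chunk))  # random position for the control
        stream.extend(chunk[:slot] + [random.choice(controls)] + chunk[slot:])
    return stream

def quality_score(answers, truth):
    """Share of control images a volunteer classified correctly.
    `answers` maps image id -> volunteer label; `truth` maps control
    image id -> expert label."""
    scored = [answers[img] == label for img, label in truth.items() if img in answers]
    return sum(scored) / len(scored) if scored else 0.0
```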
  6. Field size campaign: Design. 130 volunteers, 390,000 validations; 130,000 locations worldwide in 4 weeks; 4,000 control points; initial training sites and feedback.
  7. Field size campaign: Interface. Three grid colours/sizes, size and dominance selection, background layers, supporting tools (e.g. a measuring tool), additional info (e.g. image date), plus skipping, Google Earth, examples and "ask the experts".
  8. Field size campaign: How to tell? (thresholds sketched in the code below)
     • Very small: fields smaller than one yellow cell (less than 80 x 80 m, i.e. 0.64 ha)
     • Small: fields between one yellow cell and four yellow cells (2.56 ha)
     • Medium: fields bigger than four yellow cells but smaller than the red box (16 ha)
     • Large: fields bigger than the red box but smaller than the blue box (100 ha)
     • Very large: fields larger than the blue box
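A compact way to express these thresholds; the function is a sketch, assuming field areas are given in hectares:

```python
def field_size_class(area_ha: float) -> str:
    """Map a field area in hectares to the campaign's five size classes.
    Thresholds follow the slide's grid geometry: one yellow cell is
    80 x 80 m = 0.64 ha, four cells = 2.56 ha, the red box = 16 ha,
    the blue box = 100 ha."""
    if area_ha < 0.64:
        return "very small"
    if area_ha <= 2.56:
        return "small"
    if area_ha <= 16:
        return "medium"
    if area_ha <= 100:
        return "large"
    return "very large"
```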
  9. Field size campaign: Results. Dominant field size map. Lesiv M, Laso Bayas JC, See L, et al. (2017), Global Change Biology.
  10. Picture Pile: Rapid change mapping. Two modes: 1. Rapid image assessment; 2. Change detection. Designed to be a generic, flexible tool, customizable to different domains that require EO data as an input resource.
  11. Picture Pile: Post-disaster mapping. Hurricane Matthew post-disaster damage mapping: 179 volunteers, 249K validations. Task question: "Do you see damaged buildings?"
  12. Picture Pile: Control and feedback. Experts produce control images; control images are first used to learn, then one check every 10 images; minimum 4 evaluations per image.
  13. Picture Pile: Damage map. 250,000 images; duration: 3 weeks, with half of the images classified in 5 days. Decision rules (see the aggregation sketch below):
      • Damaged = 4 or more volunteers classified damage
      • No damage = 4 or more classified no damage
      • Likely damaged = 3 damage votes
      • Unknown = no majority
      • Not usable = 4 or more classified cloud cover
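A minimal sketch of these decision rules as a vote-aggregation function; the label names and the order in which the rules are checked are assumptions:

```python
from collections import Counter

def aggregate_votes(votes):
    """Apply the slide's decision rules to a list of volunteer labels,
    each one of 'damage', 'no_damage' or 'cloud'."""
    c = Counter(votes)
    if c["cloud"] >= 4:
        return "not usable"
    if c["damage"] >= 4:
        return "damaged"
    if c["no_damage"] >= 4:
        return "no damage"
    if c["damage"] == 3:
        return "likely damaged"
    return "unknown"

print(aggregate_votes(["damage"] * 4 + ["no_damage"]))  # -> damaged
print(aggregate_votes(["damage"] * 3 + ["no_damage"]))  # -> likely damaged
```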
  14. LACO-Wiki: Web and mobile land cover validation platforms. Workflow: 1. Upload, 2. Sampling, 3. Validation, 4. Reporting.
  15. LACO-Wiki: ESA CCI 20 m validation. Systematic samples for Egypt, Côte d'Ivoire, Gabon, Zambia and Kenya; grid step ~12 km; each location a 20 m x 20 m cell. Kenya validation done on site using LACO-Wiki online: 55% accuracy.
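A rough sketch of how a systematic sample at this grid step might be generated; the degrees-per-km conversion and the example bounding box are assumptions:

```python
def systematic_grid(lon_min, lat_min, lon_max, lat_max, step_km=12.0):
    """Yield systematic sample centres at roughly the slide's ~12 km
    spacing. Uses a crude 1 degree ~ 111 km conversion; a real design
    would work in a projected CRS and adjust longitude spacing by latitude."""
    step = step_km / 111.0
    lat = lat_min
    while lat <= lat_max:
        lon = lon_min
        while lon <= lon_max:
            yield (lon, lat)
            lon += step
        lat += step

# Example: a systematic sample over an approximate bounding box for Kenya.
kenya_sample = list(systematic_grid(33.9, -4.7, 41.9, 5.5))
```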
  16. LACO-Wiki: Features being added. NDVI; a mobile component based on FotoQuest; classifications can be made freely available for use as training data.
  17. Cropland validation campaign: Potential error sources
      • Definitions: e.g. cropland vs. pasture, percentage covered (50%)
      • Perception of sizes and proportions
      • Imagery: cloud coverage, acquisition time, resolution
      • Competition: quantity vs. quality (power users)
      • Contributors: several opinions for one location
      • Skipping images and not using the tools: higher speed at the cost of quality?
      • Errors in control images: who is an expert?
  18. Cropland validation campaign: Potential strategies to correct errors
      • Clearer definitions, with general and borderline examples
      • Analysis: number of skipped images, use of tools, quality scoring
      • Obtain user profiles (survey) and consistency (performance)
      • Models that weight users' contributions according to performance (see the sketch below)
      • Expert points as secret controls?
      • Potential use of newly available information, e.g. very high resolution imagery (spatial and temporal availability)
      • Training of users before and during the campaign
      • Quality score: penalizing speeding, plus immediate feedback
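One plausible form such a performance-weighted model could take; the weighting scheme below is an illustrative sketch, not a method from the slides:

```python
def weighted_majority(labels_by_user, user_quality):
    """Pick the label with the largest quality-weighted support.
    `labels_by_user` maps volunteer -> label for one location;
    `user_quality` maps volunteer -> control-image score in [0, 1].
    Unknown users default to a neutral 0.5 weight (an assumption)."""
    totals = {}
    for user, label in labels_by_user.items():
        totals[label] = totals.get(label, 0.0) + user_quality.get(user, 0.5)
    return max(totals, key=totals.get)

# Two weaker votes for 'cropland' outweigh one strong vote against it.
print(weighted_majority(
    {"a": "cropland", "b": "cropland", "c": "no cropland"},
    {"a": 0.6, "b": 0.6, "c": 0.9},
))  # -> cropland
```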
  19. Thank you for your attention. Contact: Center for Earth Observation and Citizen Science. Juan Carlos Laso Bayas, lasobaya@iiasa.ac.at; Steffen Fritz, fritz@iiasa.ac.at. The YSSP Program, for PhD students: 3 summer months at IIASA; registrations for the 2019 program are being accepted from 1 Oct 2018 to 11 Jan 2019.
  20. Center for Earth Observation and Citizen Science at IIASA (EOCS)
      • Explore earth observation (EO) and crowdsourcing (CS) capabilities
      • New technologies and social innovation
      • CS and EO for the SDGs
      • Lower costs and extend in-situ data collection
      • Environmental monitoring by citizens through apps, virtual campaigns and open platforms
  21. Comparison of the crowd with experts (confusion matrix: rows = individual volunteers, columns = experts):

      Crowd \ Experts   Not usable   Damage   No damage      %
      Not usable               926      184         108   76.0
      Damage                    79     8602         280   96.0
      No damage                163      466        6243   90.9
      %                       79.3     93.0        94.2   92.5

      • 562 expert locations, each seen many times by the volunteers
      • Overall agreement of 92.5%
      • Missed damaged buildings 7% of the time
      • Saw damage 4% of the time when experts did not
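The slide's agreement figures can be reproduced directly from the confusion matrix; a short sketch:

```python
import numpy as np

# Confusion matrix from the slide: rows = crowd, columns = experts,
# classes in order: not usable, damage, no damage.
cm = np.array([[926,  184,  108],
               [ 79, 8602,  280],
               [163,  466, 6243]])

overall = cm.trace() / cm.sum()              # 0.925: the 92.5% agreement
crowd_acc = cm.diagonal() / cm.sum(axis=1)   # per row: 76.0, 96.0, 90.9 %
expert_acc = cm.diagonal() / cm.sum(axis=0)  # per column: 79.3, 93.0, 94.2 %
print(f"overall agreement: {overall:.1%}")
```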
  22. Campaign from 2017: agreement of up to 78% with the major classes; feedback provided to users in less than 24 hours.
  23. LACO-Wiki: Web and mobile land cover validation platforms: https://laco-wiki.net
  24. LACO-Wiki: Tools. KML files to open in the Google Earth desktop application (historical imagery, pictures); different layers; NDVI tool linked to GEE.
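A minimal sketch of the kind of NDVI computation such a GEE-linked tool might perform; the image collection, band names and date range are assumptions, since the slide only states that the NDVI tool is linked to GEE:

```python
import ee

ee.Initialize()

def ndvi_layer(lon, lat, start="2018-01-01", end="2018-12-31"):
    """Compute NDVI for the least cloudy Sentinel-2 scene over a point."""
    point = ee.Geometry.Point([lon, lat])
    scene = (ee.ImageCollection("COPERNICUS/S2")
             .filterBounds(point)
             .filterDate(start, end)
             .sort("CLOUDY_PIXEL_PERCENTAGE")
             .first())
    # NDVI = (NIR - red) / (NIR + red), bands B8 and B4 for Sentinel-2.
    return scene.normalizedDifference(["B8", "B4"]).rename("NDVI")
```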
