lecture: Cheap (and good) data capture for environmental projects
Satellite image archives provide a wealth of valuable historical data for assessing environmental change, but extracting high-quality information can be costly and time-consuming if interpretation is restricted to experienced image analysts. We attempt to reduce these limitations by crowdsourcing the interpretation process through a web-based digitizing system built entirely on open-source tools. This approach can lower project costs by eliminating office space and equipment for the analyst, while allowing flexible working hours and locations.
The challenge with this approach is ensuring that interpretation quality remains high. Within the context of a project to model historical iceberg occurrences off the coast of Greenland, this talk discusses the methods we have implemented for quality control, including training and feedback provided to our analysts by an interpretation expert.
The business case for this approach will also be discussed, including the risks and rewards of paying interpreters for each correctly digitized feature. In our case we were able to quickly and accurately interpret several hundred images, yielding measurements of tens of thousands of features. By using cloud-based image archives and client/server strategies, this approach can be scaled economically to much larger projects.
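The pay-per-correct-feature model described above implies some automated check of each analyst's work against an expert reference. The abstract does not describe the actual mechanism, so the following is only an illustrative sketch: it greedily matches analyst-digitized points to expert reference points within an assumed positional tolerance and computes a payment from the match count. All names, tolerances, and rates here are hypothetical.

```python
# Hypothetical sketch of pay-per-correct-feature quality control:
# a feature counts as correct if it lies within TOLERANCE_PX pixels of an
# unmatched expert reference feature. Tolerance and rate are assumptions.
from math import hypot

TOLERANCE_PX = 5.0       # assumed max pixel distance for a correct match
RATE_PER_FEATURE = 0.10  # assumed payment per correct feature

def count_correct(analyst_pts, expert_pts, tol=TOLERANCE_PX):
    """Greedy one-to-one matching of analyst points to expert points."""
    remaining = list(expert_pts)
    correct = 0
    for ax, ay in analyst_pts:
        match = next(((ex, ey) for ex, ey in remaining
                      if hypot(ax - ex, ay - ey) <= tol), None)
        if match is not None:
            remaining.remove(match)  # each expert feature matches at most once
            correct += 1
    return correct

expert = [(10, 10), (50, 42), (80, 15)]
analyst = [(12, 9), (49, 44), (200, 200)]  # two near-matches, one false positive

n_correct = count_correct(analyst, expert)
payment = n_correct * RATE_PER_FEATURE
print(n_correct, round(payment, 2))  # 2 0.2
```

A real system would likely use proper geometry matching (e.g. polygon overlap rather than point distance) and aggregate scores over many images before paying out, but the incentive structure is the same.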
Start time: 11:00