By Johanna van Megern
The rapid development of modern technology brings many positive developments to the fields of VGI and DRM. Web 2.0 is an evolution of the interactivity between users and web pages that is changing our lives in ways no one imagined only a decade ago. One of the areas influenced by Web 2.0 is cartography and mapping in general: volunteers thousands of miles away can help create a map of roads and houses in hazardous environments. Volunteered Geographic Information (VGI) originates outside the data collection of scientists and geographers. It is therefore important to establish some form of quality assurance for people who want to use the information, so that they can judge whether it is fit for purpose. The question that sums up the issue is whether it is possible to trust the information provided by VGI (Goodchild & Li, 2012).
Until recently, the organization, creation and distribution of geographic information were handled by official agencies, and production was an industrial, scientific process. The outcome was a highly standardized, expensive, carefully engineered product fit for a variety of purposes. Stepping away from this process therefore leaves many scientists worried about quality assurance. Without a doubt, VGI has many advantages: it provides geographic information free of charge, is often more up to date than expensive commercial products, may be even more detailed, especially in densely mapped neighborhoods, and often contains very specific annotations from particular peer groups (for example, good lookout points for photographers). A closer look at VGI, however, also reveals many disadvantages. Information can be manipulated; anyone, trained or not, can upload or edit information without citing reliable sources. Can we trust, or even use, geographic information coming from:
· Anonymous sources (with no authority and no responsibility)
· Contributors with unknown intellectual background (non-professionals in cartography or geography)
· People who do not provide metadata
· People who create spatial information with unknown incentives and unknown data completeness or accuracy
· People who do not uphold quality standards
(Flanagin & Metzger, 2008)
The quality of volunteered geographic information depends on the credibility of its sources. To define and understand credibility, Flanagin and Metzger distinguish two dimensions: trustworthiness and expertise. A credible source should combine both. Furthermore, they distinguish two types of credibility: credibility-as-accuracy and credibility-as-perception. The first underlines the accuracy of information and its adequacy for describing and evaluating “scientific knowledge production” (Flanagin & Metzger, 2008). Credibility-as-perception relies on the notion of believability; much VGI today is based on information, opinions or perspectives that people believe, rather than on scientifically verified accuracy. This type of credibility mostly serves social or political purposes and establishes the social and political power of VGI. VGI systems therefore need to be understood as socio-technical systems in which the social aspect is as important as the technical component.
Furthermore, a study by See et al. (2013) focuses on the difference in data quality between experts and non-experts, finding that the differences are almost non-existent. It is even stated that, due to population density, some rural areas have more accurate and up-to-date information in OSM than in professional geographic datasets. Linus's Law (the more eyes reviewing, the more accurate the information) cannot, however, be applied to sparsely populated areas, where the quality of information cannot be guaranteed. This emphasizes that VGI is inherently heterogeneous: some areas receive more attention than others. Besides this crowd-sourcing approach based on Linus's Law, Goodchild and Li propose a social approach that relies on a hierarchy of moderators and gatekeepers; all volunteered information is referred up the hierarchy to ensure accuracy, consistency and compliance with geographic standards. Thirdly, they propose a geographic approach, which uses geographic knowledge to verify that contributed information is consistent with the real world. This includes comparing possibly outdated recorded information with the VGI to see how much of the already known information appears in the edited data. These three approaches are likely to be used in combination to achieve a higher success rate (for example, in OSM).
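To make the geographic approach a little more concrete, here is a minimal sketch in Python of the kind of plausibility check it implies (my own illustration, not an implementation described by Goodchild and Li): a volunteered feature is compared against existing reference data of the same type and referred to a moderator when nothing comparable is found nearby. The data schema, the coordinates and the 50 m threshold are invented for the example.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    R = 6371000  # mean Earth radius in metres
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def plausibility_check(contribution, reference_features, max_offset_m=50):
    """Compare a volunteered feature against reference data of the same type.

    Returns a review verdict. The dict schema ('type', 'lat', 'lon') and the
    threshold are illustrative assumptions, not an OSM or agency API.
    """
    same_type = [f for f in reference_features if f["type"] == contribution["type"]]
    if not same_type:
        return "no comparable reference feature: refer to moderator"
    nearest_m = min(haversine_m(contribution["lat"], contribution["lon"],
                                f["lat"], f["lon"]) for f in same_type)
    if nearest_m <= max_offset_m:
        return "plausible: matches known geography"
    return "suspicious: refer to moderator"

# Invented example data: a volunteered 'school' node checked against a reference set.
reference = [
    {"type": "school",   "lat": 49.4093, "lon": 8.6937},
    {"type": "hospital", "lat": 49.4175, "lon": 8.6672},
]
new_edit = {"type": "school", "lat": 49.4094, "lon": 8.6939}
print(plausibility_check(new_edit, reference))  # -> plausible: matches known geography
```

In practice such automated checks would only flag candidates for the human moderation described above, which is why the approaches are best combined.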
Overall, the field of VGI is evolving rapidly and forces expert producers to rethink their traditional approaches to spatial data production. The sustainability of user-generated VGI, however, rests on better management of contributors and contributions to ensure that the spatial data remains of high quality.
Discussion questions:
- What is the motivation of active contributors? (What is your motivation?)
- Do you have ideas on how VGI and professional geographic agencies can work together in the future?
- What are the most important qualities of geographic information and how can those be assured in VGI?
Sources:
Flanagin, A. J., & Metzger, M. J. (2008). The credibility of volunteered geographic information. GeoJournal, 72(3-4), 137-148. doi:10.1007/s10708-008-9188-y
Goodchild, M. F., & Li, L. (2012). Assuring the quality of volunteered geographic information. Spatial Statistics, 1, 110-120. doi:10.1016/j.spasta.2012.03.002
See, L., Comber, A., Salk, C., Fritz, S., van der Velde, M., Perger, C., … Obersteiner, M. (2013). Comparing the quality of crowdsourced data contributed by experts and non-experts. PLoS ONE, 8(7), e69958. doi:10.1371/journal.pone.0069958
Hi everyone!
Thank you for this great summary of what I think is a great topic.
Your text answers the questions from the topline really well, so I would like to focus on two of your questions.
1. People working as volunteers (like we did with OSM data in disaster mapping) want to help, and I believe that is the answer. Nobody would consider doing this volunteer work if they were not interested in doing it the right way. People who want to help will help! And even if someone adds false information, others will check and correct it.
2. VGI is free information that can be used by every person in the world as well as every agency. The official agencies already produce very accurate maps, yet those could become even more accurate by adding VGI. The only problem I see here is that the agencies might not want to use this information, because they cannot assure the quality of their result by simply adding the data. They would probably have to check every single piece of information they get from VGI to confirm the quality of their final product.
Greetings
David
Hello everyone,
I found the text from Goodchild and Li a little bit confusing. They describe – as Johanna has summarized – three concepts that “offer direct improvement in quality” (p. 119). But to me they rather describe different arguments for why VGI could potentially be trustworthy; they do not really examine whether their concepts are useful or how they could be implemented.
For the crowd-sourcing approach they even show that incorrect information persists even in densely populated regions (pp. 113f.), contradicting Linus's Law. So here I have to disagree with David's answer to question one (“people will check and then change these”).
Regarding question two: what is needed is a methodology that autonomously examines the quality of a given VGI dataset. From what I have read so far, I think we are far from such procedures. Current research rather focuses on what might determine quality and then verifies its approaches with case studies. Maybe in the future, once there is general agreement on how to define data quality, such general methodologies can be designed. But maybe in the course of the seminar I will learn otherwise ;)
Thank you Johanna - very interesting blog post and good questions.
I think quality assurance is problematic. Of course people want to help and want to add only valid data. Additionally, I think it is mainly people who also use the data who tend to add data themselves; why else would they contribute to something they do not regard as necessary or useful? So it is in their own interest to add only valid data. But deficient data still exists (as seen in the mandatory reading), so those errors mostly do not happen on purpose. People are prone to making errors in every part of life, and such errors can be kept within bounds. The question is whether the errors in VGI are comparable to the errors made by official producers, and whether they can be compared at all. In the reading, a golf course is mentioned with a beautiful outlook and more information than a standard map would offer. The problem with such information is that it can be subjective and change over time, especially quickly changing information (e.g. during a disaster). I think it has to be considered what the mapped information is used for and whether high precision is necessary for my purpose. For example: if I work with a normal GPS mobile phone (which has internal errors of up to 10 m), is it important that the data I use to navigate has a higher precision? The rough calculation below illustrates this point.
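To put that in rough numbers (a back-of-the-envelope sketch of my own; the 10 m phone GPS figure is the one mentioned above, the map accuracies are assumed values): if the GPS error and the map error are roughly independent, they combine approximately as the square root of the sum of squares, so map precision far below the GPS error barely changes the total positioning uncertainty.

```python
from math import sqrt

def combined_error_m(gps_error_m, map_error_m):
    """Rough combined positional uncertainty, assuming independent errors."""
    return sqrt(gps_error_m ** 2 + map_error_m ** 2)

gps = 10.0  # typical smartphone GPS error mentioned above, in metres
for map_err in (5.0, 1.0, 0.1):  # assumed map accuracies, from VGI-like to surveyed
    print(f"map error {map_err:4.1f} m -> total ~{combined_error_m(gps, map_err):.1f} m")
# map error  5.0 m -> total ~11.2 m
# map error  1.0 m -> total ~10.0 m
# map error  0.1 m -> total ~10.0 m
```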
I’m curious about the discussion this topic will spark tomorrow in the course (sorry for the late comment by the way).