Why we’re failing to get the most out of open data

An unprecedented number of individuals and organizations are finding ways to explore, interpret and use Open Data. Public agencies are hosting Open Data events such as meetups, hackathons and data dives. The potential of these initiatives is great, including support for economic development (McKinsey, 2013), anti-corruption (European Public Sector Information Platform, 2014) and accountability (Open Government Partnership, 2012). But is Open Data's full potential being realized?

A news item from Computer Weekly casts doubt. It cites a recent report noting that, in the United Kingdom, poor data quality is hindering the government's Open Data program. The report goes on to explain that, in an effort to make the public sector more transparent and accountable, UK public bodies have been publishing spending records every month since November 2010. The report's authors, who analyzed 50 spending-related data releases by the Cabinet Office since May 2010, found that the data was of such poor quality that using it would require advanced computer skills.

Far from being a one-off problem, research suggests that this issue is ubiquitous and endemic. Some estimates indicate that as much as 80 percent of the time and cost of an analytics project is attributable to the need to clean up 'dirty data' (Dasu and Johnson, 2003).
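To make the 'dirty data' problem concrete, here is a minimal, purely illustrative sketch of the kind of cleanup work such spending releases can demand. The rows, field names and formats below are hypothetical assumptions, not drawn from the actual Cabinet Office releases; they stand in for common inconsistencies such as mixed date formats, currency symbols, thousands separators and variant supplier names.

```python
import re
from datetime import datetime

# Hypothetical rows illustrating typical inconsistencies in published
# spending records: mixed date formats, currency symbols, thousands
# separators, stray whitespace, and variant spellings of one supplier.
raw_rows = [
    {"date": "03/11/2010", "supplier": " ACME LTD ", "amount": "£1,200.50"},
    {"date": "2010-11-12", "supplier": "Acme Ltd",   "amount": "950"},
    {"date": "12 Nov 2010", "supplier": "acme ltd.", "amount": "£2,000"},
]

DATE_FORMATS = ("%d/%m/%Y", "%Y-%m-%d", "%d %b %Y")

def parse_date(text):
    """Try each known date format in turn; raise if none match."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unrecognised date: {text!r}")

def parse_amount(text):
    """Strip currency symbols, separators and whitespace; return a float."""
    return float(re.sub(r"[£,\s]", "", text))

def normalise_supplier(text):
    """Collapse case and punctuation variants of the same supplier name."""
    return re.sub(r"[^a-z0-9 ]", "", text.strip().lower())

clean_rows = [
    {
        "date": parse_date(row["date"]).isoformat(),
        "supplier": normalise_supplier(row["supplier"]),
        "amount": parse_amount(row["amount"]),
    }
    for row in raw_rows
]
```

Even for three rows, most of the code is devoted to normalization rather than analysis, which is precisely the 80-percent point made above: before any question can be asked of the data, someone with programming skills has to reconcile its formats.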