I’m pretty new to the Data Transformation Programme here at Defra, and reflecting on the last few months has made me think a lot about what data means for our past and our future.
I actually used to work on transparency policy back in 2014, and in that post-Open Data White Paper world, the focus was all about persuading (and often mandating - with big sticks) colleagues to release the information and data we held as a government into the public domain. The driver was a push from the top: the belief that this would better enable the public to hold us to account, and in so doing improve public services and realise efficiencies.
So, a civically minded bus driver in Leeds could easily take a look at their local crime/hospital/education statistics, compare them against other areas, and write to their MP (or whoever) to give them a piece of their mind about why performance in their area was so far below par. Wheels would be set in motion, phone calls would be made, and performance would (the theory went) naturally improve.
Some really cool products came out of this era. The Police crime statistics service is a great example of how fairly rudimentary data (crime type, date committed, postcode and so on) can easily be combined into a visualisation product that helps inform citizens about topics of local interest. Slowly, change started to happen, and those who had previously been reluctant to release information - lest a mistake end up on the front page of a newspaper - started to realise the value that openness and transparency could have.
Coming back into the data field recently, it’s been fantastic to see how far we have come since then, and just how radically the landscape has changed. At Defra, we hold such vast amounts of data, touching on so many areas of our lives, that I’ve come to be in awe of the almost unlimited potential this data has to improve our very existence. We just need to find better ways of managing that data, more innovative mechanisms for combining, manipulating and interrogating it, and more user-friendly ways of disseminating it. I say ‘just’ – it’s a pretty monumental task – but momentum grows stronger all the time.
I often use the Environment Agency’s excellent Bathing Water Quality indicators to check out my likelihood of picking up some nasties when I go for a swim. It’s a great bit of work that is super-useful, and the chain of links needed to make it happen is immense: from a local field operative collecting the samples, to someone in a white coat somewhere testing them, to someone recording the data on a spreadsheet (I guess), to someone building the software that displays the information.
But the possibilities when we combine and use this base data in new ways are really inspirational. I saw a tool recently that brought together this water quality data with estimated visitor numbers, proximity to built-up areas and a whole bunch of other things to start to place a value on the beaches and recreational areas around our country - and, in so doing, to help us better protect and develop the resources at our disposal.
And in Greater Manchester, the Open Data Infrastructure Map has had developers combining and layering a series of datasets - including transport links, utility supplies, local schools and flood risk - to provide a user-friendly tool that helps property developers identify brownfield sites most suitable for construction, helps local government planners assess applications, and lets citizens hold both to account by scrutinising development.
All super-innovative work that ultimately enhances the lives of our citizens. And perhaps a million miles away from what was generally envisaged when government started to release data in a bid to be more transparent.
The photo used for the cover image of this post is of an anthropomorphic figure in Alice Holt Forest, part of the Forestry Commission estate.