Sustainability and the climate change emergency

Notes from an event at the Royal Geographical Society, 9 October 2019. Using data to build public and decision-maker awareness of climate change. (My sense was actually that the event showed that stories are more powerful than data in getting people to care about this kind of thing)

Sophie Adams, Ofgem

Ofgem is working to decarbonise the energy system.

They’ve been working to make their data machine readable. They’ll then publish it on their data hub, through the Energy Data Exchange.

They’re taking in information from the Met Office and matching it up with price changes over time, to see the impact that weather has on energy prices.

17-18 Jan 2020 – Ofgem and Valtech will co-host a hackathon on visualising environmental data. It’ll ask questions like “How could we decarbonise the UK in 5 years?”

Jo Judge, National Biodiversity Network

They get data in lots of different formats. Converting this into something consistent and usable is a challenge. Encouraging people to use this biodiversity data also takes work. Their State of Nature report visualises and summarises some of this data.

Philip Taylor, Open Seas

Mapping cod volumes and fishing locations over time, using publicly-available data, provokes conversations about the management of this resource. (Of course I disagree with this conception of these creatures as a resource.)

Open Seas tries to take data and turn it into public awareness and better decision-making. They also use data to spot illegal fishing.

Using boat beacon data along with geospatial data on protected areas to spot boats that have fished illegally.
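
As a rough illustration of the technique (not Open Seas’ actual pipeline), a vessel position from a beacon can be checked against a protected-area boundary with a standard geometry library. The coordinates and vessel IDs below are invented.

```python
from shapely.geometry import Point, Polygon

# Hypothetical protected-area boundary as (longitude, latitude) pairs.
protected_area = Polygon([(-5.2, 56.1), (-5.0, 56.1), (-5.0, 56.3), (-5.2, 56.3)])

# Hypothetical vessel beacon pings.
vessel_pings = [
    {"vessel": "ABC123", "lon": -5.1, "lat": 56.2},  # inside the boundary
    {"vessel": "DEF456", "lon": -4.8, "lat": 56.0},  # outside the boundary
]

# Flag any ping that falls inside the protected area.
for ping in vessel_pings:
    if protected_area.contains(Point(ping["lon"], ping["lat"])):
        print(f"{ping['vessel']} recorded inside the protected area")
```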

Chris Jarvis, Environment Agency

The Environment Agency use data to create the UK Climate Projections, looking at the impact that climate change will have on weather. They’re working on linked data to allow their datasets to be built up in useful ways.

We used to think about flood defence. That’s not viable any more – we now think about resilience. The Environment Agency want to build a “nation of climate change champions” – people who know what’s happening, the risks and impact on them and what they can do.

Two-thirds of the 5 million people whose homes are at risk of flooding are unaware of that risk.

The Environment Agency are great at flood forecasting. Data is collected as often as every 15 minutes, built up over time, and made openly available.

This data is available through an API. There’s information on hydrology and flood monitoring, including flood areas.
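
For example, the real-time flood-monitoring data can be fetched over plain HTTP with no API key. A minimal sketch, assuming the /id/floods endpoint and guessing at the response field names:

```python
import requests

# Current flood warnings and alerts from the Environment Agency's
# real-time flood-monitoring API.
response = requests.get(
    "https://environment.data.gov.uk/flood-monitoring/id/floods", timeout=10
)
response.raise_for_status()

for warning in response.json().get("items", []):
    # Field names are a guess at the response shape; check against real output.
    print(warning.get("description"), "-", warning.get("severity"))
```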

Aside: The Met Office’s data might also be interesting and useful. Information on their products and hourly site-specific observations.

Ben Rewis, Save the Waves

Dirty Wave challenge – crowdsourced data on dirty beaches, with an incentive to take action.

Users take a photo, tag it to their geolocation, and classify the type of problem that it relates to.

Advice for January hackathon: convene a set of people with shared values. Use technology to add value in some way. Get standards to encourage reuse and interoperability. Connect shared communities to a bigger picture. You might either get people to passively add data, or to interrogate, curate and work with what is already there.

Build a food bank API – part 1

I’m going to try and build an API that tells you the items needed by nearby foodbanks.

An API is a tool that lets you quickly interface between bits of technology. If a tool has an API, it means that web developers can quickly interact with it: either taking information out of it or sending information or instructions to it. Using an API to interact with a bit of software is generally quick and easy, meaning that you can spend your time and energy doing something special and interesting with the interaction, rather than working out how to get the things to talk to each other in the first place. Twitter has an API which lets you search, view or post tweets; Google Maps has an API that lets you build maps into your website or software. I built a tool around the Twitter API a few years ago and found it a real thrill.

The idea for this API came from Steve Messer. I haven’t worked on a creative/web development project for about a year, and I’ve been feeling eager to take one on. I know that I learn a lot working on a personal project. I also experience a fantastic sense of flow.

Inspired by the Weeknotes movement, I’m going to write a series of blog posts about how I get on.

Goal for the project

Make an API that, for a given geolocation, returns the nearest 3 foodbanks, with a list of the items that they need.
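
To make that concrete, here’s a minimal sketch of the core lookup, with made-up foodbank data: compute the great-circle distance from the query point to each foodbank and return the closest three with the items they need.

```python
from math import asin, cos, radians, sin, sqrt

# Made-up example data; the real list would come from Trussell Trust foodbank pages.
FOODBANKS = [
    {"name": "Northtown Foodbank", "lat": 51.50, "lon": -0.12, "needs": ["UHT milk", "tinned fish"]},
    {"name": "Southside Foodbank", "lat": 51.45, "lon": -0.10, "needs": ["nappies", "pasta sauce"]},
    {"name": "Riverside Foodbank", "lat": 51.55, "lon": -0.20, "needs": ["long-life juice"]},
    {"name": "Eastgate Foodbank", "lat": 51.52, "lon": -0.05, "needs": ["tinned vegetables"]},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearest_foodbanks(lat, lon, count=3):
    """Return the `count` foodbanks closest to the given point, with their needed items."""
    return sorted(FOODBANKS, key=lambda fb: haversine_km(lat, lon, fb["lat"], fb["lon"]))[:count]

print(nearest_foodbanks(51.51, -0.13))
```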

How I’m approaching the work

I’m going to focus on the Trussell Trust, as they have a large national network of foodbanks – whose websites seem to work in the same way.

I’m starting by testing some risky assumptions. If these assumptions turn out to be wrong, I might not be able to meet my goal. So I want to test them as soon as I can.

Currently known risky assumptions

  • If I know the URL of a given foodbank’s page on food donations, I can work out what items they need.
  • All Trussell Trust foodbanks organise their websites in the same way.
  • All Trussell Trust foodbanks describe the items they need in the same way.
  • I can access or somehow generate a comprehensive and accurate list of all Trussell Trust foodbanks.
  • If I have a list of Trussell Trust foodbanks, I can straightforwardly work out the URLs of their pages describing the items they need.
  • I can scrape the information I need from the relevant server(s) in a courteous way.
  • It won’t be very difficult to build a data representation of food banks and required items, or to store this in an appropriate database. (See the sketch after this list.)
  • Building and running the API won’t be too much fuss. (Or, less concisely: it’s possible to build a lightweight, modern infrastructure to host a database for this API and serve requests without too much complexity or cost.)
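
On the data representation assumption, here’s a minimal sketch of how a foodbank and its needed items might be modelled before worrying about a database. The field names are mine, not the Trussell Trust’s, and the example values are made up.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Foodbank:
    """A single foodbank and the items it is currently asking for."""
    name: str
    latitude: float
    longitude: float
    donations_url: str  # the page listing the items they need
    needed_items: List[str] = field(default_factory=list)

example = Foodbank(
    name="Example Foodbank",
    latitude=51.5,
    longitude=-0.1,
    donations_url="https://example.foodbank.org.uk/give-help/donate-food/",  # hypothetical URL
    needed_items=["UHT milk", "tinned fruit"],
)
```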

Side challenge

Can I host this API in a way that is carbon neutral or, even better, renewably-hosted?

If I can’t, can I at least work out how much it’s polluting and offset it somehow?

What next

I’m going to start by working on the first risky assumption – “If I know the URL of a given foodbank’s page on food donations, I can work out what items they need.”
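
A first experiment might look like the rough sketch below, using a hypothetical URL and guessing at the page structure: fetch a foodbank’s donations page politely (identify the scraper, don’t hammer the server) and pull out list items that might be needed foods. The real markup will almost certainly differ.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical donations page; real Trussell Trust URLs and markup may differ.
URL = "https://example.foodbank.org.uk/give-help/donate-food/"

# Identify the scraper politely and set a timeout.
response = requests.get(URL, headers={"User-Agent": "foodbank-api-experiment"}, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Guess: the needed items are list items somewhere on the page.
items = [li.get_text(strip=True) for li in soup.find_all("li")]
print(items)
```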

Read part 2 of this project to find out what I did next.

Audio Experience Design

Dr Lorenzo Picinali, Senior Lecturer in Audio Experience Design at Imperial College London, visited GOV.UK to talk about his work. He works on acoustic virtual and augmented reality. He’s recently worked on 3D binaural sound rendering, spatial hearing, interactive applications for visually impaired people, hearing aids technologies, audio and haptic interaction.

Vision contains much more information than sound. If there’s audio and visual input, our brains generally prioritise the visual.

e.g. the McGurk illusion: visual input shapes our understanding of sound.

https://www.youtube.com/watch?v=G-lN8vWm3m0

Echolocation: this blind man gets information on size, layout, texture and density by making a clicking noise and listening to the echoes. He trained his brain to better localise echoes.

(To learn more, check out this episode of the Invisibilia podcast)

In some contexts sound is better than vision:

  • It’s 360 degrees. You don’t have to be looking at it.
  • It’s always active. e.g. good for alarms.
  • Occlusions don’t make objects inaudible. (You can often hear things even if there’s another object in the way, whereas line of sight is generally blocked by other objects.)
  • Our brain is really good at comparing sound signals
  • We’re better at memorising tonal sequences than visual sequences.

Examples of good interfaces that use sound:

  • Sound can be useful to give people information in busy situations. e.g. a beeping noise to help you reverse park.
  • Music to help pilots fly level at night. With this interface, the left or right volume would change if the plane was tilting, and the pitch would go up or down if the plane was pointing up or down. This worked really well.
  • A drill for use in space. Artificial sound communicated speed and torque.

Acoustic augmented reality is a frontier that hasn’t been explored yet. We can match the real world and the virtual world more convincingly than with visual elements of augmented reality, where it’s quite clear that they aren’t real.

Our ears are good at making sense of differences in volume and in the time that sound reaches them. This lets us work out where in space sounds are coming from. Our binaural audio processing skills mean that we can create artificial 3D soundscapes.
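
As a rough illustration of the timing cue (not something from the talk), the Woodworth approximation estimates the interaural time difference for a source at a given azimuth, assuming a typical head radius and speed of sound:

```python
from math import radians, sin

HEAD_RADIUS_M = 0.0875   # assumed average head radius, in metres
SPEED_OF_SOUND = 343.0   # metres per second, roughly, at room temperature

def interaural_time_difference(azimuth_deg):
    """Woodworth approximation: ITD = (a / c) * (theta + sin(theta)), theta in radians."""
    theta = radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + sin(theta))

# A source 90 degrees to one side arrives at the far ear roughly 0.66 ms late.
print(f"{interaural_time_difference(90) * 1000:.2f} ms")
```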

https://imperialcollegelondon.app.box.com/s/3ki1gg770nmzhykhvnqzzdd7ipx6ioem

Plugsonic – a platform that lets you create 3D soundscapes on the web using your own sound files and pictures.

Open standards – a cross-government technical architecture workshop

My notes from a cross-government Technical Architecture community workshop on 29 July, hosted at Government Digital Service.

Open Standards are publicly-available agreements on how technology will work to solve a particular problem, or meet a particular need.

The Open Data Institute has a useful definition of standards, open standards and open data.

Open Standards are good for laying the foundations for cooperation in technology, as they allow people to work in a consistent way. e.g. HTML is an open standard, which means that everyone can build and access web pages in the same way.

As technology develops, the standards can be updated, allowing innovation in a way that retains the benefits of interoperability.

How GDS works with Open Standards – Dr Ravinder Singh, Head of Open Standards, Government Digital Service

GDS outlines the Open Standards it supports. You can suggest standards that should exist. You’ll be asked 47 assessment questions. If a proposal comes out of that, GDS will take this to the Open Standards Board, which meets twice a year. The new Open Standard will be published on GOV.UK if it’s adopted. It’ll be incorporated into Service Assessments and the Technology Code of Practice.

PDFs are still the most frequently uploaded filetype on GOV.UK. So there’s a long way to go in making HTML and other open standards the default. (Why content should be published in HTML not PDF)

Supporting the adoption of open standards – Leigh Dodds, Open Data Institute (ODI)

CSVW lets you add metadata describing a CSV file’s structure and schema.
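
For example, CSVW metadata is a JSON document published alongside the CSV (conventionally at <filename>-metadata.json), describing each column in a tableSchema. A hedged sketch for a made-up spend.csv, written out from Python:

```python
import json

# A minimal CSVW metadata document for a hypothetical spend.csv.
metadata = {
    "@context": "http://www.w3.org/ns/csvw",
    "url": "spend.csv",
    "tableSchema": {
        "columns": [
            {"name": "date", "titles": "Payment date", "datatype": "date"},
            {"name": "supplier", "titles": "Supplier name", "datatype": "string"},
            {"name": "amount", "titles": "Amount (GBP)", "datatype": "decimal"},
        ]
    },
}

# The conventional location for the metadata is alongside the CSV itself.
with open("spend.csv-metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```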

Open Standards for Data – ODI microsite

“Open standards for data are reusable agreements that make it easier for people and organisations to publish, access, share and use better quality data.”

ODI have produced a canvas to help you think about researching and designing a standard. The technical bit is the easy bit – the hard bit is getting people to agree on things.

Some advice if you’re building a new open standard:

  • Don’t dive into the technology before understanding the problem
  • Invest time in getting people to agree
  • Invest time in adoption. Don’t just write the specification. You need guidance, training, tools and libraries.
  • Focus on the value you’re trying to bring, not just the standard as an end in itself.
  • If you think you want a standard, be clear what type of standard you mean. Types of standard include:
    • Definitions
    • Models
    • Identifiers
    • Taxonomies
    • File formats
    • Schemas
    • Data transfer
    • Codes of practice
    • Data types
    • Units and measures
    • How we collect data

Opportunities for adopting open standards in government

Some thoughts from my group:

Schemas for consistent transparency publishing on data.gov.uk. Currently lots of datasets are published in a way that doesn’t allow you to compare between them. e.g. if you are comparing ‘spend above £25k’ data between councils, at the moment this isn’t interoperable because it’s structured in different ways. If all this data was published according to a consistent structure, it would be much easier to compare.
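
As a sketch of what a consistent structure could look like (my own illustration, not an agreed government schema), two councils publishing ‘spend above £25k’ data against the same column set produce files that can be concatenated and compared directly:

```python
import csv

# Hypothetical shared column set for 'spend above £25k' publishing.
COLUMNS = ["body", "date", "supplier", "description", "amount_gbp"]

rows = [
    {"body": "Anytown Council", "date": "2019-06-01", "supplier": "Acme Ltd",
     "description": "Road resurfacing", "amount_gbp": "31000"},
    {"body": "Othershire Council", "date": "2019-06-12", "supplier": "Widgets plc",
     "description": "IT hardware", "amount_gbp": "27500"},
]

# Because both publishers use the same columns, their files can be
# appended together and compared without any re-mapping.
with open("spend-over-25k.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
```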

Shared standard for technical architecture documentation. This would make it easier for people to understand new things.

Do voice assistants have an associated standard? Rather than publishing different (meta-)data for each service – e.g. having a specific API for Alexa – it would be better for all of these assistants to consume content/data in a consistent way.

The (draft) future strategy for GOV.UK involves getting a better understanding of how services are performing across the whole journey, not just the part that is on GOV.UK. Could standards help here?

Kate Manne: Down Girl – Summary

Patriarchy is supported by misogyny and sexism

Misogyny is a system of hostile forces that polices and enforces patriarchal order.

Sexism: “the branch of patriarchal ideology that justifies and rationalises a patriarchal social order”
Belief in men’s superiority and dominance.

Misogyny: “the system that polices and enforces [patriarchy’s] governing norms and expectations”
Anxiety and desire to maintain patriarchal order, and commitment to restoring it when disrupted.

A reduction in sexism in a culture might lead to an increase in misogyny, as “women’s capabilities become more salient and hence demoralizing or threatening”

Women are expected to fulfil asymmetrical moral support roles

Women are supposed to provide these to men:

  • attention
  • affection
  • admiration
  • sympathy
  • sex
  • children
  • social, domestic, reproductive and emotional labour
  • mixed goods, like safe haven, nurture, security, soothing and comfort

Goods that are seen as men’s prerogative:

  • power
  • prestige
  • public recognition
  • rank, reputation
  • honor
  • ‘face’
  • respect
  • money and other forms of wealth
  • hierarchical status
  • upward mobility
  • the status conferred by having a high-ranking woman’s loyalty, love, devotion, etc.

If women try to take masculine-coded goods, they can be treated with suspicion and hostility.

There are lots of “social scripts, moral permissions, and material deprivations that work to extract feminine-coded goods from her” – such as:

  • anti-choice movement
  • cat-calling
  • rape culture

There are lots of mechanisms to stop women from taking masculine-coded statuses – such as:

  • testimonial injustice
  • mansplaining
  • victim-blaming

An example of this asymmetric moral economy:

“Imagine a person in a restaurant who expects not only to be treated deferentially – the customer always being right – but also to be served the food he ordered attentively, and with a smile. He expects to be made to feel cared for and special, as well as to have his meal brought to him (a somewhat vulnerable position, as well as a powerful one, for him to be in). Imagine now that this customer comes to be disappointed – his server is not serving him, though she is waiting on other tables. Or perhaps she appears to be lounging around lazily or just doing her own thing, inexplicably ignoring him. Worse, she might appear to be expecting service from him, in a baffling role reversal. Either way, she is not behaving in the manner to which he is accustomed in such settings. It is easy to imagine this person becoming confused, then resentful. It is easy to imagine him banging his spoon on the table. It is easy to imagine him exploding in frustration.”

Praise, as well as hostility, enforces patriarchy

“We should also be concerned with the rewarding and valorizing of women who conform to gendered norms and expectations, in being (e.g.) loving mothers, attentive wives, loyal secretaries, ‘cool’ girlfriends, or good waitresses.”

Misogyny is not psychological

Misogyny isn’t a psychological phenomenon. It’s a “systematic facet of social power relations and a predictable manifestation of the ideology that governs them: patriarchy.”

Misogyny is banal (to adapt a famous phrase of Hannah Arendt’s).

This understanding of misogyny is intersectional

Misogyny is mediated through other systems of privilege and vulnerability. Manne does not assume some universal experience of misogyny.

Shout out to “The Master’s Tools Will Never Dismantle the Master’s House” critiquing middle class heterosexual white women over-generalising on the basis of their experience.

A quick note on privilege

Privileged people “tend to be subject to fewer social, moral, and legal constraints on their actions than their less privileged counterparts”


What’s new in the new Service Standard

The Government Digital Service recently launched a new version of the Service Standard. What’s changed?

  • It’s now called the Service Standard, not the Digital Service Standard. This reflects the desire to create end-to-end services. This is better than creating digital services, and then (if you’re lucky) considering assisted digital as an afterthought. People are encouraged to provide a joined up experience across channels. What’s the user experience like if a user phones or emails you?
  • Removed the requirement to tell everyone to use the digital service. Digital isn’t always the right channel, and there’s already a financial imperative for service owners to shift people to digital channels, so we didn’t need to push that any more. Instead, we need to encourage people to think more broadly about the service, not just the digital part.
  • Focus on solving a whole problem for users, not just a part of it. The Standard encourages people to ask if the service is part of a wider journey. e.g. business tax registration is probably part of a broader journey of starting a business. So you should join up with those services too.
  • The team have added more information on why the Service Standard expects certain things, and the benefits of following the Standard. So it’s less doctrinaire and encourages people to do the right thing.
  • People are challenged to go beyond just thinking about accessibility, and to think about inclusion more generally: e.g. trans people and same sex relationships.
  • The type of approach to meeting user needs is challenged. Is the service the right way to meet the user need? Or should you publish content or make data available via an API instead?
  • The scope of the service is questioned. If it’s too broad or too narrow it’s a problem.
  • Removed the requirement to test with the minister.

Being a Product Manager is good training for life

Being a product manager helps you live a good life:

You become confident in dealing with different domains of knowledge and different types of truth, and arbitrating between them.

You can imagine and weigh up possible futures, and sell and fight for the one that seems best.

You make decisions with imperfect information.

You lead from the intersection between optimism and pessimism: seeing how things could be different and better, and helping others believe, while rooting your thinking in an understanding of complexity, risk and assumptions.

You call out the risky assumptions underlying wishful thinking.

You’re adventurous and humble and focus on value.


Innovation in Transport – TfL Access all Areas

Notes from a panel discussion I attended at an event on accessible transport, hosted by TfL. I attended because I wanted to learn about TfL’s strategic approach to innovation and accessibility.

Individual introductions

Mike Brown, Commissioner, TfL

The high-level vision for accessibility on London’s transport: Everyone needs to be able to travel across the capital easily, safely and spontaneously.

Alan Benson, Chair, Transport for All

Transport for All is a charity that works with TfL as a critical friend. Recent accessibility improvements include:

  • new trains, which made many more stations level access
  • ‘please offer me a seat’ badge. (The badge is a little controversial, but Alan’s happy with it as long as it’s optional and people don’t feel labelled.)
  • disability training programme for TfL managers. Disabled people are teaching people who commission and run the services so that they can better understand the impact of the choices they make. No one else in the British transport sector is doing this. Alan thinks this is the most important improvement.

Gareth Powell, MD, Surface Transport, TfL

Transport exists to get people to a place they want to go. But 84% of disabled Londoners say that transport is negatively affecting their ability to get around and live their lives.

We haven’t designed with everyone in mind. Designing for step-free access is one thing, but what about designing for people with autism? Hence the training of managers, and more involvement of people with disabilities in the design process.

Panel session

Chair: Joanna Wotton, Chair, TfL Independent Disability Advisory Group

Panellists:

  • Alan Benson, Chair, Transport for All.
  • Michael Hurwitz, Director of Transport Innovation, TfL. (Says that his job “mostly involves worrying and internal procurement processes.”)
  • Nick Tyler, Director of University College London’s Centre for Transport Studies.
  • Ed Warner, Founder and CEO Motionspot Ltd

What are the biggest challenges facing accessible transport in London, and how might innovation help?

Michael: TfL is more used to working with massive companies than small companies. He’s keen to pave the route to market for these promising minimum viable products. The commercial and contractual discussion about innovation isn’t trendy, but it is super-important.

Nick: We need a better ability to test out new ideas. (His research group’s shiny new lab should help with this.) He wants TfL to be braver in encouraging innovation. Hong Kong and Singapore are more innovative than this country.

Michael: Singapore introduced universal design principles – trying to institutionalise the right types of design considerations. But Hong Kong and Singapore are more top-down, whereas London is much more bottom-up. e.g. there are 34 highways authorities in London. TfL does have the benefit of London’s size – the city is big, so you can test things out in small parts of it.

Alan: Try new things faster. Health and Safety concerns often lead us to hold back from testing things until they’re 100% ready. We could follow the lead of other industries that will launch things that are 80% ready.

Ed: Interesting innovations include the use of colour in wayfinding, e.g. Barajas airport in Spain. Japanese train stations play 7 seconds of melody before announcing a train’s departure platform, and this cut accidents by 25%. The music settles everyone down a little bit.

Nick: We should work to make transport more enjoyable. This will make it more accessible. So look at cafes or playgrounds, and see how you might make things better on transport. This changes your way of thinking from “we need to make this system work to the timetable” to focus more on enjoyment (and, implicitly, value to humans).

Michael: Innovation isn’t always about technology. A lot of the most powerful innovations are in behaviour change. e.g. Dementia Friends at TfL. Or more assertive messaging to encourage people in Priority Seats to look up for someone who might need their seat. (They’re trialling this shortly.)

Nick: To get parents to change their behaviour, teach their kids. Parents listen to their children much more than they listen to the government.

How might we have better interfaces between public transport and the rest of the world?

Cities are people and we build the infrastructure around them. If we concentrate on building for people, then everything will get better.

Rethinking Capitalism

Notes on a lecture by Eric Beinhocker on ‘The economy as a complex and evolving system’

The income of the bottom 90% of earners in the US has stagnated from 1973 onwards.

Pre-1973 growth was more inclusive – the bottom 90%’s income grew faster than the average. Now the reverse is true, with the incomes of the top 1% and 0.1% growing much faster.

Productivity gains stopped flowing to workers and started flowing to owners of capital.

A shift from constructive to extractive capitalism.

This has weakened the post-war social contract.

Beinhocker sees this as caused by the rise of neoclassical economics and neoliberal ideology and the associated structural changes driven by this dominant way of thinking about the economy and society.

The breakdown of Western capitalism

The rise of neoclassical economics and neoliberal ideology led to these structural changes:

  1. Globalisation of finance
  2. Shareholder value revolution (focus on shareholders as the only legitimate interest in a company)
  3. Neoliberal public policy agenda
  4. Shift from virtue ethics to selfish utility maximisation

This shifted power in the economy, increased power of capital versus labour, and shifted rents to the top of the system. These rents were used to capture the political system.

Rethinking economics

To change the system, we need to revisit the ideas and assumptions behind it.

Neoclassical economics assumes:

  • Micro – individual behaviour
    • People are rational utility maximisers
    • People are self-interested and atomistic
  • Meso – social structures
    • Markets are efficient in allocation
    • Markets are self-correcting
    • Firms are optimally run
    • State ‘interference’ causes welfare loss
  • Macro – system level behaviour
    • Macro is a linear adding up of micro
    • Natural state is full-employment equilibrium

But the underlying assumptions don’t match with reality:

Neoclassical theory assumption → real-world empirical evidence:

  • Individuals maximise utility and preferences are consistent → utility functions are not stable and play no role in decision making
  • Individuals behave ‘rationally’ and deductively → individuals behave heuristically and inductively
  • Individuals have perfect information → individuals have highly limited and asymmetric information
  • Individuals are self-regarding and asocial → individuals are other-regarding and highly social
  • Markets are complete → markets are incomplete
  • Micro adds linearly to macro → macro emerges from non-linear micro interactions
  • Markets always find equilibrium → markets can be out of equilibrium for significant periods of time

Hence the following incorrect policy memes:

  • Markets are always self-correcting
  • Inflation is always and everywhere a monetary phenomenon
  • You can either have growth or equality
  • Trade is always welfare-increasing
  • Raising wages reduces the demand for labour (e.g. minimum wage)
  • The goal of business is maximising shareholder value
  • Tackling climate change will cost growth and jobs

… and the incorrect belief that the economy, as an equilibrium system, cannot:

  • endogenously grow
  • create novelty
  • generate spontaneous order
  • spontaneously crash

New thinking: the economy is a complex adaptive system

  • Complex: Many interacting agents and organisations of agents
  • Adaptive: Designs and strategies evolve over time
  • System: macro patterns emerge non-linearly from micro-behaviour

So in this way of thinking the economy is:

  • An unfolding process
  • Dynamic
  • Non-linear
  • Distributed
  • Heterogeneous
  • Networked
  • Computational
  • Autopoietic (self-creating)
  • Self-organising
  • Evolutionary
  • Reflexive
  • A social phenomenon
  • A cultural phenomenon
  • A physical phenomenon

Reimagining capitalism

Capitalism isn’t great at efficient allocation, but it’s good at getting people to cooperate to innovate and solve problems. Beinhocker is happy with capitalism in general, just not the current implementation.

Prosperity isn’t money – it’s the accumulation of, and access to, solutions to human problems.

So we can redefine capitalism:

  • The purpose of capitalism is to provide solutions to human problems.
  • Wealth is the accumulation of solutions to human problems
  • Growth is the rate at which new solutions are created and made available
  • Prosperity is the set of solutions plus access
  • Goal of business and investment is to create new, better solutions and make them more widely available
  • Markets and governments together create an interdependent evolutionary ecosystem for solution creation and access.

Traditionally we’ve assumed that price is the measure of value. But that doesn’t help us distinguish between good and bad economic activity. (e.g. cigarettes or polluting activities. Or creating a fancy derivative product that ultimately makes the economy more unstable.)

The redefinition of capitalism above helps us distinguish between good and bad economic activities. Ask these questions:

  • Is my solution creating a problem for someone else?
  • Is my solution creating problems for society? (e.g. derivatives)
  • Is my solution today creating problems for the future?
  • Am I solving a real human problem or just rent seeking? (e.g. high-frequency trading isn’t)

Human cooperation is fundamental to problem solving.

To demonstrate this, Thomas Thwaites tried to build a toaster from scratch.

Because cooperation is crucial, and because capitalism exists to solve human problems, then inclusion and a fair social contract are fundamental to capitalist prosperity.

Fixing capitalism

We need to shift our thinking

From → to:

  • Left vs right → how best to foster inclusion, fairness and trust
  • Market vs state → effectiveness of the market and state ecosystem
  • GDP → solutions to human problems
  • Market efficiency in allocation → market effectiveness in innovation
  • Purpose of firms is to maximise shareholder value → purpose of firms is to make products and services that solve human problems
  • Labour is a cost to be minimised → employment is a key means of inclusion
  • Markets are amoral → moral fairness underpins capitalism
  • Economy is separate from society and environment → economy is embedded in a larger complex system of society and environment

So we need a broad set of reforms to support:

  • Inclusion:
    • Economic
    • Social
    • Political
  • A fair, reciprocal social contract
  • Effectiveness in problem-solving innovation
  • Demand for, and access to, new solutions
  • Problem-solving vs problem creating (e.g. the environment)

The value of opening up government data

I presented at the British Library’s event “Open and Engaged” as part of Open Access week on 22 October 2018, on the value of opening up government data. Here are the slides, which I’ve adapted into the following post, with a post-script of additions generously suggested by the ever-excellent Steve Messer.

Why has government opened up data?

Probably the first motivation for opening up government data was to increase transparency and trust. The MPs’ expenses scandal led to a political drive to make government and politics more transparent.

Data.gov.uk was commissioned by Gordon Brown, overseen by Tim Berners-Lee, and built in 2009/10.

In 2010 Prime Minister David Cameron wrote to government departments on plans to open up government data, promising “Greater transparency across government”. He wrote of a desire to “enable the public to hold politicians and public bodies to account”, “deliver better value for money in public spending” and “realise significant economic benefits”.

What transparency data is published?

Theresa May published a letter in December 2017 clarifying what data departments and the Cabinet Office were expected to publish. This includes things like the pay of senior civil servants, and central government spending over £25,000. (Monitoring and enforcing this is an interesting challenge. Subsequent to this talk, I’ve been having some thoughts and conversations about how we might do this better.)

Transparency data around the world

In the USA you can track spend data back to policy commitments:

It will have taken a lot of political work to have consistent identifiers between different parts of government, so that this type of scrutiny is possible. Not glamorous, but very valuable – a trend you’ll see more of in data work.

My favourite example of transparency work in other countries is DoZorro.org:

Ukraine’s recent reform work is highlighted by its more open online public procurement system. An ecosystem of tools and an engaged community has emerged around this data. Citizen monitoring platform www.DoZorro.org has been used to bring 22 criminal charges and 79 sanctions so far.

This open procurement data has also led to the creation of a tool for identifying corruption risks http://risk.dozorro.org/:

It’s also led to the creation of a business intelligence tool http://bi.prozorro.org:

This takes us to the second big benefit of opening up government data.

Economic value of opening up government data

Open data improves data sharing within government. Previously, having to send a Freedom of Information request to someone else in your own department to access information was a thing that actually happened.

The datasets on data.gov.uk that are used most generally have a clear economic use. These include datasets with information on land and property payments, or information on MOT testing stations. Other popular datasets are more related to understanding society, and are likely used by councils, third sector organisations and other agencies interested in planning service provision – e.g. English Indices of Deprivation 2010 and Statistics on Obesity, Physical Activity and Diet, England.

Measuring value is hard

I don’t think the above section was compelling enough. This is because measuring the value of open data is hard. There are a number of different techniques you can use to measure value. None of them are great – either you have something cheap and broad, which doesn’t give deep insight, or you have to commission a deep and expensive study.

Johanna Walker, University of Southampton, is a great source on this type of thing. She presented on ‘Measuring the Impact of Open Data’, Paris, 14 September 2018. (Aside: Johanna Walker suggested semantically-augmented version control as a way of ensuring quality, consistency and giving a better idea of how a dataset is being used.)

This post has more thoughts about the value of open data.

International Leadership

The UK’s early work on opening up government data has helped set the direction internationally.

International rankings are a relative thing, and the rest of the world has been catching up.
The UK has been first in the Open Data Barometer for 5 years in a row. Now we’re joint first with Canada.

Global Open Data Index

Some other countries are doing really good things with procurement data. This work is hard – it took Brazil 5 years to get consistent identifiers so that you can link policy to budget to spend.

Challenges highlighted by user research

International rankings are lovely, but what do we know about the use of open government data and the challenges associated with this?

Government data is hard to find and use. Working with government data is hard – even for users who know government:

  • Metadata is inconsistent and incomplete.
  • Titles are sometimes obscure.
  • Some datasets are published more than once.
  • Data users can only understand how useful the data is once they’ve downloaded it, and not all data users have the capability to download and view certain types of data.

Competing catalogues, standards, technologies and propositions.

We don’t have consistent, reliable data:

  • Basic data skills and literacy aren’t strong enough to consistently produce data that is findable, usable, comparable and reliable.
  • This means that many of the solutions designed to make data easier to find are only theoretically possible.
  • It means that many services that need consistent, reliable data are also only theoretical.

Where there is a relationship between the data publisher/producer and the data user, the quality of the data and metadata is better.

Publishing is not enough

Publishing open data is a crucial start. But it isn’t enough.

We need to optimise for use and value.

“It is not enough to have open data; quality, reliability and accessibility are also required.”

Theresa May, December 2017

“Availability is only one aspect of openness, and the government also has a role to play in increasing the value of data, through structuring and linking datasets.”

Treasury Discussion Paper – The Economic Value of Data, 2018

“If no-one can find and understand data it is not open.”

Laura Koesten

How to get more value from data

Some ideas by Elena Simperl and Johanna Walker (Analytical Report 8: The Future of Open Data Portals, 2017) on how we might get more value from open data portals. These actually apply more broadly. Some highlights to pick out:

  • Linking and interoperability, including consistent schemas. So that you can get network effects from combining datasets.
  • Colocation of documentation and tools to reduce barriers to entry.
  • Organisation for and promotion of use – thinking about how to get value out of the data rather than seeing the job as finished when the data has been published: reflecting back the uses made of the data to teach and inspire others, and fostering some level of community around it.


Opening up the Geospatial frontier

There’s lots of excitement around the potential of geospatial data. In the UK there’s a newly-created Geospatial Commission looking into how to open up this data. One of its key early tasks is opening up some of the Ordnance Survey’s Master Map data. This looks likely to be on a tiered basis, so free up to a point but paid beyond that.

Some highlights of what this Master Map gives include: Building Height data, Water Network Layer.



Even here we have the question of linked identifiers. The Geo6 are looking at this. This is the kind of graft we need to get the most value from our data.

In summary

Transparency and economic value are the key drivers behind government publishing open data.

But publishing data is not enough – we need to work hard to understand user needs and maximise use and the value derived from data.

Bonus: Extra insights from Steve Messer

TfL

They opened up their data en masse and waited for people to use it before improving it. Here’s the original download page and a post by Chris Applegate on why the first release could be improved. It has since been improved, thus proving your point about ‘optimise for use and value’. Optimise after you’ve opened it up. The MVP for opening up data is creating a good standard and putting it out there.

Economic value

I’ve tried to find some £ figures to help you back this point up, as below:

Local government

I once made a timeline of notable events in local gov open data. It’s in this blog post.