This is a second blog post following the launch of the UN report on the ‘Data Revolution’. I'll be exploring my own ideas, trying to outline some of the ways that complexity theory is being used in this space, and seeing if I can illustrate some points from my previous post as well. I am slowly rolling these ideas along; they appear to have first been scribbled, in apparent frustration, on the post-it below (there were many more), during a collaboration and discussion that took place earlier in 2014 at the Open Development Fringe Event at OKFest in Berlin.
Brainstorm At Open Development Fringe Event 2014
Creative Commons Licensed (BY) flickr photo by tomsalmon: http://flickr.com/photos/fishytom/15185058154
One of the biggest challenges facing policy makers today is complexity, and this relates in interesting ways to ideas about a ‘Data Revolution’. There are quite a few initiatives working out ways to understand the limits to governing and acting instrumentally to fulfill policy goals. The OECD published a piece on ‘Applications of Complexity Science for Public Policy’ in 2009, and I'll take that as a fairly good indication that the topic is on the horizon of mainstream public policy debates. I am not particularly familiar with the approaches that are established for dealing with 'complexity', but my understanding is that complex situations lead some people to offer preventative policy approaches, others to adopt the terminology of resilience, while a final group makes up a more transformative paradigm. I'll try to avoid rolling them all into one, as I am sure that not all approaches are equal or have the same notion of complexity embedded in them. I also have not yet come across any 'approaches' that see a particular virtue in making claims about 'disruption' either, but I will keep my eyes open...
So take the recently published UNDP Human Development Report 2014, which proposes as its central idea the concept of ‘human vulnerability‘ as a paradigm of complexity for development in the future. It goes on to describe the post-2015 prospect of gains in development and growth being eroded, along with people’s capabilities and choices, and argues for a sustained enhancement of individuals’ and societies’ capabilities, with a particular focus on collective action, to counter this.
It puts forward an argument about ‘structural vulnerability’ which, as proposed in the report, seems to go further than ideas of ‘risk mitigation’ or more limited ‘resilience-type’ approaches, generating a clearer picture of how inequality interacts with interventions and actors. As the report points out, eliminating extreme poverty is not just about ‘getting to zero’; it is also about staying there. It strongly adopts the language of capabilities for individuals, and of cohesiveness and responsiveness for institutions and societies, describing how affirmative action and rights-based approaches fit into this picture, as well as openness, transparency and accountability initiatives.
A simple message might be that this is one way of looking at adaptation in complex systems going through processes of change and innovation. Discussions around the 'data revolution' also seem caught up in an emerging tangle of efforts around this, although of course different stakeholders have different takes on the ideas.
For example, in one corner there is some interesting work going on looking at how to solve complex problems better, at the Harvard Kennedy School's Building State Capability program and at the ODI, where they are talking about ‘Doing Development Differently’ (yes: #DDD) and Problem Driven Iterative Adaptation (yes: #PDIA).
At first glance some of the language about ‘politically smart development’ from them appears to be borrowed from programs like the Developmental Leadership Program (DLP). However, the emphasis is in fact more on thinking about design as a paradigm and using design thinking with problem-solving approaches like the 5 Whys, problem trees and fishbone diagrams (the Ishikawa diagram).
The Kennedy School program strongly references Hirschman’s ideas and approaches, quoting the “Principle of the Hiding Hand” that he proposed:
“men engage successfully in problem-solving [when] they take up problems which they think they can solve, find them more difficult than expected, but then, being stuck with them, attack willy-nilly the unsuspected difficulties – and sometimes even succeed.”
I have to say, to me this is not so much a statement of approach, but it certainly sums up a lot of my experience of 'problem solving' within large, slow-moving organisations that are not particularly good at encouraging divergent thinking. Another way to put it, as Matt Andrews does on his blog, is that people have strong incentives to opt for 'big best practice type initiatives' and will also tend to 'overestimate the potential results'. So let's imagine upfront that we accept the degree to which path dependency and institutional constraints affect thinking in our organisation. Then the only real 'problem solving' going on is in the attempts to devise a means, at each step of the process, to reconcile 'unsuspected difficulties' with the received 'best practice' methods. In some ways this seems to relate to a lot of the ideas around Agile Development and Open Development as well, though probably from a different perspective. It also speaks to Easterly's long-standing distinction between 'planners' and 'searchers' in development, which is also championed by many market-based approaches to development work.
Let’s illustrate this in a context which is broadly the one that this blog post is addressing. The UN report is highlighted in a recent Economist article on the ‘Data Revolution’, which states that a key problem is simply the ever-growing divide between data-rich countries and data-poor ones. The article helps to paint a picture of why some of the slow-moving machinery based on national statistics and silos of project data around the MDGs is unlikely to be fit for purpose, or responsive to the shifts and shocks (like Ebola) that could impact the post-2015 world. In fact it takes Ebola as a prime example of why real-time data is so important in an interconnected world.
The Economist goes on to explain that humanitarian action is being hampered greatly by the lack of appropriate data on the location of hospitals and by poor maps of cities. Volunteer efforts with the Humanitarian OpenStreetMap Team and charities like MSF and the Red Cross have led the way in creating MissingMaps.org, which is trying to create free maps of cities across the developing world, particularly in potential trouble spots, before crucial data is needed in a disaster situation. Similarly, it mentions how call data from mobile-phone operators can be put to use, for example by comparing data on malaria outbreaks with data on people’s movements in order to predict the spread of disease. This seems all the more pressing considering the current disaster unfolding around Ebola.
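To make the mapping side of this concrete, here is a minimal sketch (in Python, using the requests library) of how anyone can pull hospital locations for a city out of OpenStreetMap via the public Overpass API; this is the same open dataset that Missing Maps volunteers are helping to complete. The city name here is an illustrative assumption, not taken from the article.

```python
# Minimal sketch: query OpenStreetMap's Overpass API for hospital locations.
# The city ("Monrovia") is an illustrative assumption; area lookup by name
# can be ambiguous in practice.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

query = """
[out:json][timeout:60];
area["name"="Monrovia"]->.city;        // resolve the city as an area
nwr["amenity"="hospital"](area.city);  // nodes/ways/relations tagged hospital
out center;                            // emit one coordinate per feature
"""

response = requests.post(OVERPASS_URL, data={"data": query})
response.raise_for_status()

for element in response.json()["elements"]:
    # Nodes carry lat/lon directly; ways/relations expose a computed 'center'.
    lat = element.get("lat") or element.get("center", {}).get("lat")
    lon = element.get("lon") or element.get("center", {}).get("lon")
    name = element.get("tags", {}).get("name", "(unnamed)")
    print(f"{name}: {lat}, {lon}")
```

Because the underlying data is open, the same handful of lines serves an NGO planning a vaccination route just as well as a logistics firm, which is rather the point of what follows.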
Interestingly, the Economist then goes on to describe a role for the private sector as well. It points out that Premise (a startup in Silicon Valley) is being used to spot trends, such as the fact that as the number of Ebola cases in any one location rises, the local price of staple foods also increases dramatically. As they point out, in recent weeks, as the number of cases fell, prices did too. The Economist then notes that the authorities already realised that travel restrictions and closed borders would have this effect, but now they have a way to measure and track price shifts as they happen. The creators of Premise describe a little of the thinking (and success) behind their platform here. For some of the analysis in the press around the relationship between Ebola and food prices, and some of these concerns, there are articles here and here also.
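As a hedged illustration of the kind of signal being described (and emphatically not Premise's actual method, which is not published in detail), here is a short Python sketch that tracks the co-movement of reported case counts and a staple food price using a rolling correlation over synthetic daily data:

```python
# Illustrative only: synthetic data standing in for real case counts and
# crowdsourced price observations, to show the shape of the analysis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2014-08-01", periods=120, freq="D")

# Synthetic series: cases rise then fall; prices loosely follow, with noise.
cases = np.concatenate([np.linspace(5, 300, 60), np.linspace(300, 40, 60)])
prices = 100 + 0.05 * cases + rng.normal(0, 3, size=120)

df = pd.DataFrame({"cases": cases, "rice_price": prices}, index=days)

# 14-day rolling correlation: a simple, updatable signal of co-movement
# between the outbreak and local food prices.
df["rolling_corr"] = df["cases"].rolling(14).corr(df["rice_price"])

print(df["rolling_corr"].dropna().tail())
```

The point of a rolling window rather than a single correlation is exactly the "as it happens" quality the Economist highlights: the signal updates daily as new observations arrive.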
"World Food Programme in Liberia 002" by 26th MEU(SOC) PAO (U.S. Marines) - 26th Marine Expeditionary Unit (Special Operations Capable) Public Affairs Office. Licensed under Public domain via Wikimedia Commons.
I think this returns my attention to the issue I sketched out in my previous post: that, in essence, the Economist is celebrating the work of ‘Premise’ and others like them because they will allow food distribution companies to continue doing business globally without interruption despite the Ebola crisis and the subsequent closing of borders and quarantines. I am not entirely sure that is exactly what the creators of Premise themselves envision for their platform, which seems to aim at working for the social good in a very innovative way.
The way the Economist presents this is not actually as an initiative focusing on development per se, although it certainly could be good for the companies involved. A potential benefit might come if we were able to use the data from ‘Premise’ and other companies in an open way, reusing and remixing it in order, for example, to help predict disasters and improve the coordination of relief efforts. But the fact that people can make money out of this as a business is not its most interesting aspect. The question here should be how corporations and businesses could play a more useful role in bridging the ‘data divide’ in the global south, instead of just focusing on a bottom line.
My question is also how much institutions like the Economist and the WEF are presenting a fairly narrow view of open development and open data here, pigeon-holing them into simplistic streams: the volunteers who map out cities, and the companies who need only focus on how much extra profit the emergence of data about the developing world will potentially bring them in the long run.
The point is not that Premise (the Google-funded startup) is designed to make a profit out of its analysis of financial data, or, as the FT article puts it, that 'Premise taps into hunger for real-time data'. The point is that this is how the Economist and the FT have chosen to present it to us.
So, to return to the argument I proposed in my last blog post: at first glance the WEF is fairly comfortable with designating our personal data as a new asset class (see the report here), and like the Economist it probably sees nothing wrong with this, or with anyone else selling analytic data on financial markets in order to supply the market with real-time data. But what they are both fairly silent about is what responsibility the private sector might have to help mitigate the impact of a humanitarian crisis like Ebola.
Admittedly, it is hard to predict a crisis, or to map out an area in advance in case a crisis might occur there. On the other hand, asking companies that collect data from the developing world to consider broadly that their work could play an important role in alerting people and agencies to things like price increases in the context of a global epidemic like Ebola seems like a very useful and effective measure.
The role of open source software and open systems is also important here, because it is helping to build systems for data collection, and useful corridors and channels for information to flow. It is showing itself to be of key importance in the current fight against a threat like Ebola. This story from AllAfrica highlights some of the ways that NGOs and others are using open systems. A great resource is a recently published book from MIT Press, edited by Matthew L. Smith and Katherine M. A. Reilly, which digs into what ‘Open Development’ is and may seek to achieve in the future. Also worth pointing out is the long-standing work that rOg media in Berlin has been doing in South Sudan, and the work that Stephen Kovats from rOg has already done with UNESCO on this here.
If you consider how much is being done with open software, it begs the question: why are we not also asking companies that have useful data and resources to think more clearly about how they can move towards making them open? Is it really ever going to benefit poorer countries in Africa, with less data to work with and less digital infrastructure, if in the end all we are doing is selling food price data on regions in crisis to the market instead?
To return to the problem of complexity here, I think this illustrates why we have to think outside the ‘boxes’ with which we might at first be tempted to approach these problems. Part of my argument is that this drive to work with open data in development should not only try to make sense of the obvious challenges and opportunities for development actors in a post-2015 world, but also make sure that it takes on board Hirschman’s call for more lateral thinking.
Seeking to predict and plan for disasters that will require local solutions, and providing maps and data to help NGOs, local governments and local actors act effectively, is clearly valuable. However, an equally predictable effect of a crisis like Ebola is that markets will act in unpredictable ways that may even exacerbate the situation, and will capitalize on their access to real-time data and analysis to avoid being affected by it. Given the digital data divide, a more lateral approach would be one whereby complex problem solving is employed to test out ways of yielding benefits by matching those who can deliver solutions best with those that have the knowledge, resources and data to assist them. Not by working in silos, or necessarily by competing with large 'best practice' initiatives, and not by recreating proprietary systems that push us further towards broadening the existing digital data chasm between richer and poorer countries around the world.
So in my next blog post I will try to take a closer look at the report itself.