Back in November of last year, in a room full of policy-makers and development practitioners in the Armenian capital of Yerevan, we launched an innovation lab aimed at accelerating SDG implementation.
A part-government, part-UN hybrid, the newly christened Armenian National SDG Innovation Lab launched with a cascade of commitments, ranging from revamping Armenia’s statistical architecture to training an in-house government behavioural science team.
In any industry, there are gaps between rhetoric and results, between what sounds good and what’s actually doable. The development sector is certainly no different. With this firmly in the back of our minds and four months in the rear-view mirror, we’ve decided to take stock and ask: how are we doing? Have we lived up to our promise to be transformative? Are we really a genuine space for different kinds of actors to work together?
To do that, we’re looking through the lens of one of our very first projects: the SDG Barometer — a real-time platform for measuring and visualising implementation of the Sustainable Development Goals in Armenia.
To build a tool with relevance to a wide audience, we looked closely at different kinds of potential users to better understand their search habits and data needs.
Enter the design phase. In our small offices we could not help but have delusions of grandeur: our platform would have data on all sorts of indicators, updated in real time, possibly using prophetic “big” data, providing evidence to inform policy-makers and practitioners working across the spectrum of development practice.
As we’re an innovation lab, we decided to go out and test some of these ideas first, in this instance by speaking to our users. We focused our concept on one area, energy (enshrined as SDG 7), and set out speaking to people who used energy data on a daily, weekly or monthly basis.
For anyone still fence-sitting on the value of user research, pay attention. This is what we learnt:
- People didn’t really need or want metrics on the SDG indicators for energy. Generally, the only people who might have use for better metrics on SDG implementation were — wait for it — people reporting on the success of SDG implementation.
- People had no need for real-time data. Those working with energy data at most only needed it every month or so. For most, every six months would be fine.
- People didn’t trust the data that was available. All of the visualisations in the world could not solve the perceived inaccuracy of state data.
Three findings like these could easily have spelt doom for the uncommitted, but we struggled on. We realised we were out of our depth technically, and are now in the process of adding two data specialists to the team to help us stay within the realm of the possible.
What’s more, the team started to pool the ideas and resources of other development actors. Development cooperation and coordination sound good on a checklist but are usually decidedly more difficult to achieve to any substantive degree. We’re trying to buck the trend, starting with our colleagues at UNICEF, who will be providing child baseline data for all goals and targets. To deal with issues of trust, we reached out to the national statistical office to get it more involved. The hope is that this cooperation will snowball into something that is genuinely collaborative.
We hope to go live with our first beta test later this year and will incrementally add more and more data from as many partners as possible.
Our designers came up with two prototypes. Further testing and iteration awaits!
The experience has given us plenty of food for thought, but we’re persevering while challenging our assumptions along the way. Stay posted for updates!