What makes research effective?
5 July 2019
Kellogg alumnus Dr Gorgi Krlev (DPhil Social Impact Measurement) highlights some of the key features and insights that led to “Social Innovation – Comparative Perspectives”, edited and written by Gorgi and his colleagues Helmut Anheier and Georg Mildenberger, winning the “Best Book 2019” award of the Academy of Management’s (AOM) Public and Nonprofit Division.
Climate crisis, rising social inequalities and political polarization are only some of the many challenges we are currently facing. Social innovation can provide responses.
Social innovations are novel ways of dealing with significant and persistent social problems. These innovations may come in the form of a new product, for example an app enabling citizen participation and promoting “open government”, but they come in many other forms too: new ways of organizing to promote renewable energy production through citizen cooperatives, or new financial vehicles for social service provision, such as Social Impact Bonds, that enable risk sharing and synergies among engaged actors.
The ITSSOIN project investigated how diverse actors make social innovation happen (“Impact of the Third Sector as Social Innovation”, funded under the European Commission’s FP-7 scheme). “Social Innovation – Comparative Perspectives” (available open access) is one of the main outputs of the project and received the “Best Book 2019” award of the Academy of Management’s (AOM) Public and Nonprofit Division. I have already shared some insights on the rationale of the project and book, and its results. Here I want to reflect on what I think earned our book the prize and explain how this is relevant to how we perceive effective research.
The award notification highlighted three things in particular:
The judges found the methodology sufficiently detailed and systematic that other studies could replicate it. As the reproducibility crisis in the social sciences shows, this is by no means standard. What enabled us to uphold that rigour was that we started off by identifying “social innovation streams”, designating broad and recognized innovations instead of focusing on singular organizational initiatives. In a second step we applied a unified framework to reconstruct how the social innovations came into being. We identified key actors and critical junctures and compared different national “stories of emergence”. In other words, we took a historical, process-oriented view and borrowed from the repertoire of political science by applying so-called “process tracing”, a method typically used to study how new laws are passed.
They highlighted that the book “provides guidance to practitioners regarding how they can achieve social innovation in their contexts”. Our ability to do this depended on two factors. One was the comparative dimension of the research: only by gathering the same kind of evidence cross-nationally could we make robust claims about which factors are relevant and irrelevant for social innovation. The other was that we combined a large number of data sources to understand the studied social innovations in depth. We talked to the identified actors of social innovation, but we also analyzed national policies, citizen perceptions and media coverage of the innovations. Not all of that can be found in the book, but it was critical in deepening our understanding of the system dynamics surrounding social innovation.
They also praised the book as coherent and able to convince the reader of why it matters. Developing and performing an “integrated research programme” was key to achieving this. I just mentioned that we pursued several strands alongside one another, taking complementary perspectives; the main part of our research, however, was sequential, building knowledge step by step. This spanned all three years of the project and was guided by a clear idea of how to move from hypotheses, to the identification of the streams, to the uncovering of the main innovation actors, to analyzing how their individual traits and interplay enabled them to produce the innovation. Of course, we had to flesh out the initial concept along the way, but in fact we made very few adaptations to the original strategy. We also built co-lead structures into the so-called work packages, and almost every partner was involved in each of them. Although this required more effort than an “additive” research design, where you put individual parts together at the end, it ensured that partners identified with the work and engaged in harmonizing their work across countries and work streams.
Three lessons can be distilled from the observations I made:
First, reproducibility benefits from the application of systematic analytic frameworks, process-oriented research and methodological plurality.
Second, transfer into policy and practice is easier when the research is comparative and when it uses a variety of data sources.
Third, relevance and coherence depend on the careful integration of the constituent parts of the research programme and a strong tie-in of the project partners.
I hope these prompts will serve as inspiration for future successes.
Financing from one of the European Commission’s cross-national grant schemes is by no means a guarantee that similar outcomes can be achieved. Many might also shy away from the amount of effort that goes into such a proposal, and there are certainly still many improvements to be made in how funded research is administered and carried out. However, the European Commission’s research schemes have been a real game changer, not only in promoting these outcomes but also in moving European research institutions closer together. We are thus in a privileged position and should harness the opportunities it offers.