
Allan Little, Director, Mission Economics
Sara MacLennan, Visiting Senior Research Fellow, Centre for Economic Performance, LSE
Evidence is central to engineering a safer world. That's why Lloyd's Register Foundation has committed £15 million over the next decade to establish the Global Safety Evidence Centre – a hub for building and sharing knowledge about what works to make people safer.
But here's the challenge: evidence-based grant-making can involve complex frameworks, methods, and technical jargon that shut out smaller organisations, international partners, and frontline innovators who lack formal evaluation expertise.
Our three-month scoping project set out to change this. Working alongside the Evidence Centre team, we developed a practical, accessible framework that enables any organisation – regardless of size, location, or analytical capacity – to demonstrate the value, and value for money, of their safety work.
This seven-stage model is designed to support funders and grant applicants of all sizes and locations. Applied proportionately, the framework should scale from small community projects to major research programmes. The aim is to help all grants contribute to the pipeline of ‘what works’ evidence that the Centre will build.
A project can only deliver ‘value for money’ if it tackles problems the Foundation actually wants to solve. We call this 'Stage 0' because it determines whether value for money is even possible, and whether grants reflect the Foundation's 2024-2029 strategy priorities.
Vague ambitions produce vague results. Compare, for example, "improve safety awareness among workers" with "train 400 workers in hazard reporting by December 2026, with at least 80% passing a post-training assessment". The second creates a concrete commitment that can be measured and achieved. We provide examples to help craft objectives that are Specific, Measurable, Achievable, Relevant, and Time-bound.
This stage links activities to impacts through a logical chain. Done badly, it represents wishful thinking. Done well, it brings intellectual honesty about evidence quality and addresses key assumptions.
A fully quantified cost-benefit analysis can be unattainable or require resources disproportionate to the grant itself. Instead, we recommend a manageable framework built on the ‘four Es’ of value for money: economy, efficiency, effectiveness, and equity.
This allows organisations to demonstrate value more systematically without requiring specialist economic expertise.
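As a sketch of how the four Es might be expressed without specialist expertise, the snippet below turns them into simple ratios. All figures are made up for illustration; the variable names and benchmarks are our own, not part of the framework itself.

```python
# Illustrative only: hypothetical figures for a small safety-training grant.
# Each of the 'four Es' becomes a simple, reportable ratio.

budget = 50_000             # total grant spend (GBP)
unit_cost = 125             # cost per trainee (GBP)
benchmark_unit_cost = 150   # comparable programmes' cost per trainee (assumed)
trainees = budget / unit_cost
incidents_prevented = 36    # estimated outcome
target_incidents = 40       # outcome target agreed at the objective-setting stage
priority_group_share = 0.6  # share of trainees from high-risk occupations

economy = benchmark_unit_cost / unit_cost  # >1 means inputs bought cheaper than benchmark
efficiency = trainees / budget * 1000      # trainees reached per £1,000
effectiveness = incidents_prevented / target_incidents
equity = priority_group_share

print(f"Economy: {economy:.2f} vs benchmark")
print(f"Efficiency: {efficiency:.1f} trainees per £1,000")
print(f"Effectiveness: {effectiveness:.0%} of outcome target")
print(f"Equity: {equity:.0%} of benefits reaching priority groups")
```

Even rough ratios like these let a small organisation show where its money went and what it bought, which is the point of applying the framework proportionately.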
Strong monitoring requires specificity. Rather than planning to “monitor training completion”, effective approaches specify “monthly tracking of participant completion rates through the online learning management system, with quarterly verification through certification records”. We provide examples that organisations can adapt to their context and resources.
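To make the difference concrete, here is a minimal sketch of the kind of metric that a specific monitoring plan commits to. The records, field names, and months are hypothetical; a real system would read them from the learning management platform.

```python
# Hypothetical LMS-style records; in practice these would be exported
# from the online learning management system each month.
records = [
    {"participant": "A", "month": "2025-01", "completed": True},
    {"participant": "B", "month": "2025-01", "completed": False},
    {"participant": "C", "month": "2025-01", "completed": True},
    {"participant": "D", "month": "2025-02", "completed": True},
]

def monthly_completion_rate(records, month):
    """Share of enrolled participants who completed training in the given month."""
    cohort = [r for r in records if r["month"] == month]
    if not cohort:
        return None  # no participants enrolled that month
    return sum(r["completed"] for r in cohort) / len(cohort)

print(monthly_completion_rate(records, "2025-01"))  # two of three completed
```

A vague plan to “monitor training completion” has no equivalent of this function: specificity is what makes the metric computable, and therefore verifiable against certification records.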
If monitoring tracks what happened, evaluation explains why it happened and whether money was well spent. We outline different evaluation strategies, ensuring at least some are within reach for all grants.
Learning goals and dissemination plans shouldn't be afterthoughts. From the beginning, strong grant applications identify what knowledge gaps they'll address and how insights will reach others who can use them to improve safety outcomes.
Our project developed through a series of 'topic notes', allowing us to work alongside the Centre, exploring themes in decision science and economics. For example, we drew on our expertise in wellbeing economics to explore how subjective wellbeing measures can be used to measure and value safety impacts. In our final report, we deliberately scaled back to establish some core principles for all organisations to follow, but there is scope to push the boundaries by exploring different ways to measure and value safety outcomes.
We've also translated this framework into 'Beacon', a prototype web application that walks the Foundation through each stage with templates, examples, and sector-specific guidance. While experimental and unpublished, it shows how digital tools could make evidence-based approaches even more accessible globally.