While evaluations are an optional part of the Participatory Budgeting (PB) process, they are well worth the time and energy required to conduct them. Christine Paulin, professor at the Université de Moncton, and Jake Carlson, PBP Research Fellow, led a PB Network Study Session to share their insights on PB evaluations. Read on for their tips on how to design useful evaluations for PB processes.
What is a PB evaluation?
PB evaluations are one way to discover and share what happened with your PB process. They can help you understand how well your process met its goals and help you share key highlights with others. Evaluations usually consist of two parts: the data you collect and the report you write. The data comes from surveys and interviews with folks who led PB and participated in it.
Why conduct an evaluation?
Evaluations are important resources for everyone involved in PB – including researchers, community residents, staff leaders, and elected officials.
During our PB Network Study Session, Christine discussed the evaluation she conducted in Dieppe, New Brunswick, Canada. In Dieppe, she surveyed PB participants and analyzed trends in behaviors, opinions, and priorities. After surveying, she presented the results to Dieppe’s local government. In doing so, Christine provided local officials with critical information about the process and amplified community needs.
Evaluation data can also help build community buy-in for PB. Since PB is a relatively new innovation, folks are often skeptical about getting involved. But effective evaluation data can help staff and community see the power of PB.
When we conducted an internal review of all the evaluation survey data PB researchers have gathered across different cities, we found that between 10% and 25% of participants rarely voted or had never voted in traditional elections. For residents who believe PB is “not for them”, this data may assuage their concerns and help them visualize themselves as PB participants.
Make It Easy
PB researchers don’t have to start from scratch when designing their evaluation; they can use existing tools and metrics. We recommend using Key Metrics as a starting point for formulating evaluation questions.
It’s best to plan an evaluation strategy as early in the process as possible. This gives researchers more time to reach out to potential partners, conduct interviews, and create reports. Look to faculty members at local universities, as well as local foundations and civic organizations, to find potential research partners. As collaborators, they can help think through and build out effective evaluation strategies.
Make It Useful
Evaluations should answer questions that you and your community want answered. Focus on collecting useful information, and share it in ways that are accessible to your audience (e.g., with graphs, data visualizations, and clear language). Your audience may include community members, elected officials, journalists, and funders.
Evaluation data should also help stakeholders assess the use and growth of PB over time. Two key metrics that are useful in assessing these factors are the race and income level of participants. These metrics indicate whether the backgrounds of PB participants are representative of the broader population. People of color and low-income communities tend to be excluded from formal political processes. Based on the results of the evaluation surveys, including demographic data, PB leaders can try new outreach strategies or build on successful practices.
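For teams comfortable with a spreadsheet export, here is a minimal Python sketch of one way to check representativeness from survey data. It is an illustration, not an official PBP tool: the file name (pb_evaluation_responses.csv), the race_ethnicity column, and the benchmark percentages are all hypothetical placeholders you would swap for your own survey export and local census figures.

```python
# A minimal sketch (assumptions noted above): compare the demographic
# makeup of PB evaluation survey respondents to population benchmarks.
import pandas as pd

# Hypothetical survey export: one row per respondent.
responses = pd.read_csv("pb_evaluation_responses.csv")

# Share of respondents in each self-identified group.
participant_share = responses["race_ethnicity"].value_counts(normalize=True)

# Hypothetical citywide benchmarks (e.g., from local census data), as proportions.
population_share = pd.Series({
    "Black": 0.30,
    "Latino/a": 0.25,
    "White": 0.35,
    "Asian": 0.08,
    "Other/Multiple": 0.02,
})

# Treat groups missing from the survey as having a 0% participant share.
participant_share = participant_share.reindex(population_share.index, fill_value=0)

# Positive gap = group is underrepresented among PB participants
# relative to the broader population; negative = overrepresented.
gap = population_share - participant_share
print(gap.sort_values(ascending=False).round(2))
```

The same comparison can be run on income brackets, and the largest gaps can point to where new outreach strategies are most needed.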
Make It Good
When researchers create an evaluation survey, their primary goal should be to receive responses from a large and representative sample of PB participants. To do this, prioritize the most relevant and useful questions based on your local context. Keep in mind that shorter surveys require less time and energy to complete. And consider attaching surveys to voting sheets to help boost response rates.
If participant interviews are part of your evaluation strategy, be sure to conduct them in sensitive and inclusive ways so that folks of all backgrounds are welcome and comfortable participating. For example, Christine kept language barriers in mind and gave participants the option to complete their evaluation in French or English. Even though most people in Dieppe speak French, Christine made it a point to include members of the community who primarily speak English.
Want to learn more beyond these tips and tools? Check out these resources: