How CABS Incorporates Users’ Perspectives
The voices of program or service participants are often absent from program and policy decisions, even though the participants are the primary stakeholders of those programs and policies. To truly engage consumers and users of education and human services, we as program designers and researchers cannot simply collect information through interviews or focus groups in order to check a box. Engagement involves regular, mutually beneficial conversations where participants have room to challenge assumptions and call for changes in practice.
The CABS approach to problem-solving covers a wide range of engagement strategies to elevate the perspectives of these end users, from building empathy to mapping processes to collecting meaningful feedback on existing or new processes. We often broker conversations among administrators, staff members, and participants about critical decisions; these conversations might not otherwise happen. Before we design a solution, we diagnose barriers to participation, often starting by observing end users and collecting data from them. When we do design a solution, we check whether intervention prototypes resonate with end users and reflect their interests. Only then do we test the intervention at a broader scale to see whether it improves outcomes. We have learned that the results can differ depending on whom we engage, at what stage of a project we engage them, and why we engage them. Here are some examples from our project work.
Who? (voice): Fathers of young children
Why? (intended goal)
Improve father attendance at a fatherhood program; improve fathers’ engagement with their young children
When? (project stage)
Diagnose, Design: diagnosing barriers to attendance and designing the intervention
How? (approaches)
We asked fathers what fatherhood programs were missing, how they could feel better supported, and what barriers to participation they were experiencing. We connected with fathers multiple times. First, fathers marked up cards with the features, information, and timing they would like. Then they commented on prototypes of text messages, apps, and other tools with those features. Finally, they used the technology and shared the challenges they encountered.
What intervention decision did participant voices change? (outcome)
Although previous evidence and testing had focused on long text-message campaigns, what we heard from fathers led us to design a smartphone app instead, because an app could offer more of the features and information fathers said they wanted.
Who? (voice): Noncustodial parents
Why? (intended goal)
Improve attendance rates at initial child support meetings
When? (project stage)
Diagnose, Develop: diagnosing barriers to participation and collecting responses
How? (approaches)
We asked fathers about their experiences in the child support system and observed child support waiting rooms. We invited a panel of fathers with child support orders to review two versions of our outreach strategy.
What intervention decision did participant voices change? (outcome)
We adopted the outreach materials the panel of fathers preferred. Those materials were then used in a project with three child support agencies in Georgia.
Who? (voice): Parents of young children
Why? (intended goal)
Improve attendance at meetings to renew child care subsidies
When? (project stage)
Develop: developing processes to gather responses and make improvements
How? (approaches)
After sending an initial round of new materials meant to increase attendance at renewal meetings, we spoke with parents and learned that they sometimes skimmed the renewal letter because they assumed eligibility requirements had not changed.
What intervention decision did participant voices change? (outcome)
To catch readers who might skim the materials, we added visual cues to the text of the letter. The updated letter increased the number of parents who attended a renewal appointment.
Who? (voice): Community college students
Why? (intended goal)
Improve summer enrollment rates among community college students
When? (project stage)
Diagnose: diagnosing barriers to participation
How? (approaches)
In focus groups, students described their barriers to enrolling in summer courses; in particular, they were not certain how they could pay for those courses.
What intervention decision did participant voices change? (outcome)
Instead of a standard registration letter, we sent students personalized messages showing how much funding they had available to pay for courses and provided some students with grants to cover summer enrollment.
Who? (voice): College advising staff
Why? (intended goal)
Teach College Promise staff and students about behavioral science and the “CABS Approach”
When? (project stage)
Clarify: building empathy and brokering conversations
How? (approaches)
In focus groups, we presented students with a map of the colleges' financial aid processes and asked them to annotate it to reflect their experiences. Then, during a training session, students spoke directly with staff members, giving feedback on the staff's ideas for program changes and presenting ideas of their own.
What intervention decision did participant voices change? (outcome)
Staff members used the student-annotated process maps to update their processes and to ensure that their advising addressed students' needs and perspectives.
For more information about CABS’s work incorporating users’ perspectives, visit cabs.mdrc.org or email us at [email protected].