By Courtney Hagan
So you’ve connected with your evaluator and they have produced an evaluation plan for you, but perhaps evaluation is so new to you that the plan seems to be written in another language! SEG is here to help break down the components of the evaluation plan and make it clearer and easier to understand.
The evaluation plan contains useful information for you, such as the timeline for when to expect deliverables from your evaluator, the logic model that illustrates your program design, and the evaluation matrix or framework (discussed more below).
For a typical SEG evaluation, the evaluation plan is broken into three sections: fidelity of implementation, formative (or process) evaluation, and summative evaluation. Each of these sections has its own evaluation questions associated with it. You’ll find each of these questions in the evaluation matrix of your evaluation plan, along with the analytical procedure and data collection procedure.
For example, a question under fidelity of implementation may be ‘To what extent was the project implemented as designed?’ This question helps the evaluator assess whether implementation is on track as originally laid out in the grant application. Looking at the example below, we see that an implementation task this project needs to complete in its first year is to hire a project director. The evaluator will request project documentation for each implementation task (the data collection procedure) and then review that documentation to assess progress (the analytical procedure).
Another example comes from the formative evaluation, with the questions: ‘What feedback was offered by campus stakeholders about the implementation and benefits of project activity components? What recommendations for improvement were offered by campus stakeholders?’ This section focuses predominantly on collecting qualitative data on stakeholder feedback about project activities. Questions in the formative evaluation typically address successes, challenges, and sustainability. In the example below, this question focuses on the stakeholder group of students, with evaluators conducting a focus group to collect information (data collection procedure) and then conducting qualitative analysis on the transcript (analytical procedure).
A final example comes from the summative evaluation, with the question: To what extent has the project met its intended objectives? This section typically focuses on project outcomes and, for federal grants, may also contain Government Performance and Results Act (GPRA) performance measures. For this question and example data point from a higher education evaluation, the evaluator would work with an institutional research (IR) office to obtain lists of first-time, full-time students retained from fall-to-fall semester (data collection procedure) and then conduct an independent analysis separate from the IR office to verify the data point (analytical procedure).
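The independent verification described above can be as simple as cross-checking two enrollment lists. Here is a minimal sketch of that kind of check; the student IDs, cohort years, and list names are invented for illustration, not drawn from any actual evaluation:

```python
# Hypothetical example: independently verifying a fall-to-fall retention
# data point from two enrollment lists supplied by an IR office.

# First-time, full-time students who started in fall of year 1 (invented IDs)
fall_year1_cohort = {"S001", "S002", "S003", "S004", "S005"}

# All students enrolled the following fall (invented IDs)
fall_year2_enrolled = {"S001", "S003", "S005", "S099"}

# Retained students are cohort members who appear in the next fall's list
retained = fall_year1_cohort & fall_year2_enrolled

# Retention rate = retained cohort members / original cohort size
retention_rate = len(retained) / len(fall_year1_cohort)

print(f"Retained {len(retained)} of {len(fall_year1_cohort)} "
      f"({retention_rate:.0%})")  # → Retained 3 of 5 (60%)
```

Running this kind of calculation independently lets the evaluator confirm that the rate reported by the IR office matches the underlying rosters before it appears in the annual report.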
The evaluation plan and matrix are particularly important for understanding the data your evaluator needs every year to create an accurate report. Therefore, being able to read and comprehend the plan is helpful when working with your evaluator.
Shaffer Evaluation Group is a trusted partner in evaluation. This partnership starts at pre-award and continues throughout the grant's life cycle. If you're planning to submit a grant application and require an evaluation partner, please contact us at email@example.com.