

By Stacy Hayden, M.A., SEG Research Associate


According to one recently interviewed project director, evaluator-prepared reports have utility beyond meeting a funder’s requirements: they can inform future grant writing and current grant cycle activities. Report recipients can share sections of the report with others (e.g., a workgroup, committee, or project team), use specific questions to explore the findings and plan for the future, or use the evaluator-prepared report as a source for completing other reports. Six commonly used evaluation report sections are highlighted below, each with an explanation of how it may be useful to review with the project team and other key constituents, along with a list of guiding questions.


1. Executive Summary: The most commonly shared section of longer reports, the executive summary is especially effective for providing leadership with a high-level summary of a project. Many project directors use it as a handout or as bullets in a slide deck. In this application, questions that may be asked include:

  • Given the evaluation report findings, what resonates with you?

  • What opportunities or connections do you see between this project’s work and other initiatives within our organization (e.g., college, school system)?


2. Fidelity of Implementation Section/Table: Many reports include fidelity of implementation data in a table. Fidelity of implementation assesses the extent to which the project is being implemented in accordance with the proposed plan. When reviewing indicators in a fidelity of implementation section, some questions you may wish to ask are:

  • Based on the indicators that are met, what are we doing well?

  • What strategies helped us to meet this indicator?

  • Based on the indicators that are not met, what do we still need to work on?

  • Why did we not meet this indicator (if applicable)? What steps would we need to take to meet this indicator in the future?


3. Process Monitoring Section/Table: Some reports will also include a section on process monitoring. Process monitoring provides information to help improve the project over time. Often, this information comes from participants (e.g., students, teachers/faculty, staff). When reviewing indicators or questions in a process monitoring section, some questions you may wish to ask are:

  • What do interested parties/participants (e.g., students, community members, teachers, administrators) think is going well? How can we continue to improve or expand on what is going well?

  • What do interested parties/participants think needs improvement? What action steps should be taken to address these concerns?

  • Based on feedback, what aspects of the project might be best/feasible to sustain beyond the period of grant funding? What steps would need to be taken to achieve sustainability?


4. Survey Results: Many evaluations also involve surveys or feedback forms, and reviewing this data can be quite informative. Sometimes, specific data points are selected and presented throughout the report; when this happens, the project director can curate these statistics for review. In other cases, the report may present tables or graphs that can be used directly.

  • What surprises you about the data? In other words, what do you notice as outliers (i.e., values higher or lower than expected)? Why do you think they are higher or lower?

  • What does the data mean for the project? What should be sustained? What may need to be changed?


5. Sustainability: In many reports, the evaluator is asked to discuss what elements of the project are emerging as sustainable. There are seven key domains of sustainability (described in our June 2022 post). As you explore this section, note what areas the evaluator identified as sustainable for your project. Consider:

  • What domains of sustainability are not represented in this section?

  • What aspects of this project could emerge as sustainable? What action steps need to be taken to accomplish this?


6. Recommendations: Finally, most reports include a recommendations/conclusions section. This section may be written in a bulleted list, which makes it easier for sharing with a work group or committee. As you discuss each recommendation you may wish to explore the following questions:

  • Which recommendations are most feasible to address at this time?

  • Which recommendations may need a longer timeframe but are still worth addressing?

  • What action steps need to be taken to address this recommendation?

  • If a recommendation cannot feasibly be addressed, what is an alternate action that can be taken to partly address the recommendation?

By Courtney Hagan, Ph.D., SEG Research Associate


Evaluation requires data analysis to ensure that a project or program is making progress on its stated objectives and goals (e.g., increasing retention by 2% every year). Evaluators will request data from project or program teams on a regular basis to help ensure that these data points are included in evaluation reporting. Four tips to help ensure the proper transfer of data when working with your evaluation team are:


1. Make a data sharing plan. As stated in our October 2022 blog, “Maintaining a Project Documentation Archive”, setting up a secure shared folder is important for sharing information with your evaluator; this includes sharing data as well.


2. Involve your data people. Share the evaluation plan and data needs early with the data people in your organization (e.g., institutional research) so that they can provide insight on what they can readily provide, ask questions, and offer refinements. It can be helpful to e-introduce the evaluator and your institutional data point-of-contact.


3. Review your data request (aka "data call"). Your evaluator will request specific data items from your team. These data items align with the evaluation matrix that was created based on your project or program design. Often these data items are accompanied by definitions, since terms such as "student persistence" or "student engagement" can be defined differently on each campus. Only data that is requested needs to be uploaded to the shared folder.


4. Ensure your data is accurate, clean, and interpretable. Sometimes data calls can come at inopportune and busy times; however, it is important to ensure the data is accurate, clean, and interpretable to third parties to facilitate analysis by the external evaluator. Further, your evaluation team may request individual-level data (instead of aggregate) so that the team can make the calculations on its own as an objective third party. Because this data is requested at the individual level, it is advisable to de-identify it by using student ID numbers instead of student names.
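To illustrate the de-identification step, here is a minimal sketch (the field names and records are hypothetical, not drawn from any actual data call) that drops student names from individual-level records, keeping the student ID as the key:

```python
def deidentify(rows, identifier_fields=("student_name",)):
    """Return copies of the rows with direct identifiers removed.

    Assumes each record already carries a stable student ID, so only
    the named identifier fields need to be dropped before sharing.
    """
    return [{field: value for field, value in row.items()
             if field not in identifier_fields}
            for row in rows]

# Hypothetical individual-level records keyed by student ID.
records = [
    {"student_id": "1001", "student_name": "Alex Doe", "retained": "yes"},
    {"student_id": "1002", "student_name": "Sam Roe", "retained": "no"},
]

clean = deidentify(records)  # student_name removed; student_id kept
```

In practice, your evaluator may ask for additional fields to be removed or recoded; the specific identifier list should come from the data call itself.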


As always, every evaluation project is unique. Therefore, it is best practice to consult with your evaluation team to determine how and in what ways you should share your data.


Related post: In our October 2022 blog, we discussed sharing project documentation with your evaluator (Maintaining a Project Documentation Archive).

In last month’s blog—So Your Project Got Funded…Now What?—we reviewed steps project directors should take to support successful implementation once notice is received that a project has been funded. One of those key steps is to keep an implementation log, which provides a simple way to track grant activities (e.g., meetings, emails, programs), including who was involved and what resulted from each activity. Implementation logs were also discussed in our July 2022 blog post.


Professional development calendars and event sign-in sheets, meeting agendas and minutes, course rosters, event flyers, and other project documentation may be linked by project directors to implementation log entries. These artifacts provide evidence that project activities took place and provide insight on how a project was implemented.

Well-organized data makes it easier for the project director to provide information to the project evaluator. Five tips on how to maintain a project documentation archive appear below, using a fictional project called MIST (Motivating Individuals in Science and Technology).


1. Set up a shared folder with your evaluator. Consult with your evaluator about selecting a cloud-based file platform that both parties can securely use. Google Drive is a popular option for many school districts and higher education institutions, but there are plenty of other options, including Box, Dropbox, and OneDrive. If you are transferring sensitive information, such as personally identifiable information, make sure the system you select is secure.


2. Organize sub-folders and files within the shared folder using a structure that you can maintain, such as by activity, goal, strategy, reporting timeframe, or distinct group (e.g., student, staff, faculty). MIST had three main areas: an afterschool student club, a grade-level field trip activity, and teacher training. An appropriate folder structure therefore included three sub-folders (one for each main area) and a fourth sub-folder with general information such as the budget, grant application, and reporting requirements. Within each sub-folder, add files and additional folders as appropriate.


3. Name files for clarity: consistently name files with the project, what the file is, and the date. For example, a participant sign-in sheet for the MIST professional development may be named “MIST_PD_Sign-in_Fall_2022.”


4. Leave breadcrumbs. Some files are maintained in other places, such as an online survey tool, YouTube, or a colleague’s Google Drive. A tip is to keep a Word document or Excel spreadsheet with links to where these other documentation sources are found.


5. Annotate your documentation as needed. When something goes great, awry, or perhaps somewhere in between, consider leaving an electronic note on the documentation to remind you later of what occurred. Using the MIST afterschool club as an example, perhaps attendance was down due to the flu, students had great quotes about an activity, or the initial schedule was tweaked to better align with student needs and interests that emerged.
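Tips 2 and 3 above can even be scripted. The sketch below is a hypothetical illustration (the folder names follow the MIST layout described above, but nothing here is prescribed by the post): it creates the archive’s sub-folders and composes file names in the project–artifact–date pattern:

```python
import tempfile
from pathlib import Path

# One sub-folder per main MIST area, plus one for general information
# (budget, grant application, reporting requirements).
SUBFOLDERS = ["Afterschool_Club", "Field_Trips", "Teacher_Training", "General"]

def create_archive(root):
    """Create the archive's sub-folders under the given root folder."""
    root = Path(root)
    for name in SUBFOLDERS:
        (root / name).mkdir(parents=True, exist_ok=True)
    return sorted(folder.name for folder in root.iterdir())

def make_filename(project, artifact, period):
    """Compose a consistent file name: project, what it is, and date."""
    parts = (project, artifact, period)
    return "_".join(part.replace(" ", "_") for part in parts)

# Demo against a throwaway directory.
created = create_archive(tempfile.mkdtemp())
name = make_filename("MIST", "PD Sign-in", "Fall 2022")
# name == "MIST_PD_Sign-in_Fall_2022"
```

The same helpers would work against a synced local copy of a shared Google Drive, Box, Dropbox, or OneDrive folder; the point is simply that a naming convention is easiest to maintain when it is applied mechanically.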


Not all project documentation needs to be kept. Your project evaluator is best positioned to provide you guidance on which project documentation you should maintain based on your project’s evaluation plan.


Shaffer Evaluation Group, 1311 Jamestown Road, Suite 101, Williamsburg, VA 23185   833.650.3825

© Shaffer Evaluation Group LLC 2020
