
Effective mentorship plays a crucial role in the success and development of higher education students, particularly in STEM fields. This blog post explores four strategies for training STEM faculty to be effective mentors and discusses the importance of measuring outcomes to ensure the effectiveness of these mentorship programs.



Offer Mentorship Training Programs

To equip STEM faculty with the skills to be effective mentors, institutions of higher education should provide mentorship training programs. These programs can cover topics such as effective communication, building trust, setting goals, and providing constructive feedback. Training ensures that faculty members have the knowledge and skills to support and guide their mentees effectively.


Establish Mentor-Mentee Guidelines

Clear guidelines and expectations for mentor-mentee relationships help ensure positive mentoring experiences. These guidelines can outline the roles and responsibilities of both mentors and mentees, as well as the frequency and format of meetings. By setting clear expectations, institutions ensure that mentorship relationships are structured and productive. For instance, the University of Nevada, Las Vegas offers guidelines for developing a mentoring program that support individual faculty or faculty teams.


Create a Supportive Mentorship Network

A supportive mentorship network can provide a space where faculty mentors can connect, share experiences, and seek guidance from their peers. This can be achieved through mentorship circles, faculty mentoring communities, or regular mentorship workshops. For example, at the University of Houston, a monthly mentorship forum is organized where faculty mentors can discuss challenges, share strategies, and learn from each other's experiences. By creating a supportive network, institutions promote continuous learning and improvement among faculty mentors.


Measure Mentoring

Measuring the outcomes of mentorship programs is crucial to assess their effectiveness and make necessary improvements. Some key metrics to consider include the following (a brief computational sketch follows the list):


  • Mentee Satisfaction Surveys: Conducting regular surveys to gather feedback from mentees can provide insights into their satisfaction with the mentorship program. These surveys can assess mentees' perceptions of the support received, the quality of guidance provided, and the overall impact of the mentorship relationship.


  • Mentee Academic Performance: Tracking the academic performance of mentees helps assess the impact of mentorship on their success. Comparing mentees' performance with that of non-mentored students can provide insight into the program's effectiveness in improving academic outcomes.


  • Mentee Retention and Graduation Rates: Monitoring the retention and graduation rates of mentees can indicate the impact of mentorship on their persistence and completion of STEM programs. Higher retention and graduation rates among mentees suggest that mentorship has a positive influence on their academic journey.
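
For programs that keep even a simple student roster, the metrics above are straightforward to compute. The following is a minimal sketch in Python; the record layout, field names, and values are hypothetical, invented purely for illustration, and a real evaluation would draw on the institution's student information system.

```python
# Minimal sketch: comparing retention between mentees and non-mentored
# students. All records and field names here are hypothetical.

students = [
    # (student_id, mentored, retained_next_year, satisfaction_1_to_5)
    ("s01", True,  True,  5),
    ("s02", True,  True,  4),
    ("s03", True,  False, 3),
    ("s04", False, True,  None),  # non-mentored students take no survey
    ("s05", False, False, None),
    ("s06", False, False, None),
]

def retention_rate(records):
    """Share of the given students retained into the next year."""
    return sum(r[2] for r in records) / len(records)

mentored = [s for s in students if s[1]]
comparison = [s for s in students if not s[1]]

print(f"Mentee retention rate:     {retention_rate(mentored):.0%}")
print(f"Comparison retention rate: {retention_rate(comparison):.0%}")

# Average satisfaction among mentees who completed the survey
scores = [s[3] for s in mentored if s[3] is not None]
print(f"Mean mentee satisfaction:  {sum(scores) / len(scores):.1f} / 5")
```

Note that a simple two-group comparison like this does not account for selection effects: students who participate in mentoring may differ from those who do not in ways that also affect retention, which is one more reason to pair such figures with the satisfaction and performance evidence described above.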


Training STEM faculty to be effective mentors is crucial for the success and development of students in STEM fields. By providing mentorship training programs, establishing mentor-mentee guidelines, and creating a supportive mentorship network, institutions can enhance the quality of the mentorship they provide. Measuring outcomes through mentee satisfaction surveys, academic performance, and retention rates allows institutions to assess their programs' effectiveness and make necessary improvements. By continuously investing in mentorship training and evaluation, institutions can ensure that STEM faculty mentors are equipped to support and guide their mentees effectively, ultimately contributing to the success and growth of STEM students.


Interested in working with Shaffer Evaluation Group? Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.

SEG has the pleasure of providing evaluation services for 13 current Department of Defense Education Activity (DoDEA) grant projects. In addition, since its inception, SEG has supported 21 other DoDEA projects that are now complete. The DoDEA grant program supports projects in school districts that serve military-connected students (https://dodeagrants.org/). Each year, pending funding availability, a new solicitation is announced around the end of January. This year's competition has been announced, and proposals are due on April 18, 2025. Grants are awarded for five-year projects.



SEG has deep experience working with DoDEA projects, including designing evaluation plans that align with DoDEA's vision for how to measure project outcomes. SEG's K-12 Project Evaluation Manager, Dr. Stacy Ashworth, presented on this topic at the DoDEA Annual Community of Practice Meeting in November 2024 and covered it in a recent blog post.


DoDEA currently funds two programs: the Military-Connected Academic Support Program (MCASP), which supports projects that school districts identify as needs for their military-connected students, and the World Language Advancement and Readiness Program (WLARP). SEG has experience with both types of projects and is beginning to work with several districts on their 2025 proposals.


Shaffer Evaluation Group offers limited pro bono grant writing services to support grant applications such as DoDEA's. For DoDEA projects, we write the evaluation section, prepare the evaluation matrix, and develop the logic model at no cost in exchange for being named as the evaluator in your grant application. SEG would be happy to discuss your ideas and opportunities for collaboration in the current grant competition.

A few years ago, the Department of Defense Education Activity (DoDEA) changed how grant goals were written. At the DoDEA Annual Community of Practice Meeting in November 2024, SEG's Dr. Stacy Ashworth delivered an insightful presentation on selecting goal outcome measures to ensure grantee success. This month's blog post recaps Dr. Ashworth's guidance, offering strategies to help DoDEA grantees achieve their objectives.


A Portrait of a Successful Project

We will begin by looking at two projects with STEM goals. Each project's outcomes are listed as bullets beneath its name.

Project A

  • Decline in math state test scores from SY2019 to SY2022

  • Decline in average score of military-connected (MC) students on state test score category from SY2019 to SY2022


Project B

  • 1,400 students (potentially duplicated) participating in extracurriculars throughout the grant period

  • 70% of teachers “agreed” or “strongly agreed” that participating in STEM activities/using STEM items had improved students’ STEM skills

  • 175 computer science and STEM lessons developed

  • 1,100 students (potentially duplicated) participating in STEM Camp

From the outcomes of each project, Project B appears far more successful than Project A. However, Project A and Project B were the exact same project.


DoDEA recently refined its approach to grant goal formulation, transitioning from narrowly defined goals with a single quantitative indicator to a more comprehensive framework that incorporates multiple indicators. This shift is exemplified by Project A and Project B. In Project A, the evaluation relied on limited indicators, leading to a perception of limited success by Year 4. However, under the revised goal-setting approach in Project B, additional indicators were assessed in Year 5, revealing the project's broader achievements and successful implementation. This case underscores the importance of utilizing diverse and comprehensive outcome measures to accurately capture the full impact of educational initiatives.


Ensuring Your Evaluation Plan Highlights Your Success

As DoDEA grant evaluators, we frequently encounter projects that appear less successful due to two primary challenges:


Data Collection Challenges: Districts often plan to collect specific data but find them unavailable because the collection process is burdensome or no one at the school level is willing to carry it out.



Misalignment of Outcome Measures: There's a tendency to over-rely on state test data or other measures that, while valid, don't align with the project's objectives. For instance, if a project's strategy involves providing professional development to enhance math instruction, an aligned outcome would assess teacher math efficacy through surveys or focus groups. Although improvements in student standardized test scores are desirable, numerous factors influence these results, and they may not accurately reflect the project's success.


To address these issues, it's essential to:

  • Select a Balanced Mix of Quantitative and Qualitative Outcome Measures: Ensure that most are closely aligned with the project's goals.

  • Include Broader Outcomes Cautiously: While incorporating less-aligned outcomes, such as standardized test data, can provide additional insights, they should not be the sole indicators of success.


By adopting this comprehensive approach, we create multiple avenues to demonstrate a DoDEA project's success, leading to more accurate evaluations and meaningful educational improvements.


Sample Measures

When selecting outcome measures for your DoDEA grant evaluation, it's essential to consider existing data collection efforts and identify who will be responsible for gathering any new data. Additionally, exploring diverse methods of data analysis and presentation can tell a more comprehensive story of your project's impact. Sample lists of quantitative and qualitative sources are provided below, followed by a brief sketch of how a few of the quantitative measures might be computed.


Quantitative Sources and Measures

  • Assessments

  • Surveys

  • Feedback forms

  • Grades

    o Percent of students receiving A/B or D/F

    o Percent of students with D/F that improve to A/B by end of year

    o Improvement from prior year

  • Course enrollment

    o Enrollment in advanced/optional courses

    o Number of seats filled in [area] courses

    o Number of unique students enrolled in [area] courses

  • Service recipients

    o Number of students receiving [type of service]

    o Number of students receiving [type of service] that no longer need service by end of year

  • Number of [type] offered

    o Number of opportunities to learn about [program]

    o Number of [type of program] offered

  • Number of participants
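
As a concrete illustration, the sketch below computes a few of the grade-based and participation measures listed above from hypothetical exported records. The data layout and values are invented for illustration; in practice, these figures would come from a district's student information system.

```python
# Hypothetical sketch of a few quantitative measures listed above.
# Grade records: (student_id, start_of_year_grade, end_of_year_grade)
grades = [
    ("s01", "D", "B"),
    ("s02", "F", "C"),
    ("s03", "A", "A"),
    ("s04", "C", "B"),
    ("s05", "D", "F"),
]

AB, DF = {"A", "B"}, {"D", "F"}

pct_ab = sum(g[2] in AB for g in grades) / len(grades)
df_start = [g for g in grades if g[1] in DF]
pct_recovered = sum(g[2] in AB for g in df_start) / len(df_start)

print(f"Percent of students ending the year with A/B: {pct_ab:.0%}")
print(f"Percent of D/F students improving to A/B:     {pct_recovered:.0%}")

# Participation logs often have one row per student per activity, so the
# raw count is "potentially duplicated"; a set yields unique students.
participation = ["s01", "s02", "s01", "s03", "s02", "s01"]
print(f"Seats filled (duplicated count): {len(participation)}")
print(f"Unique students participating:   {len(set(participation))}")
```

Reporting both the duplicated and unique counts, as in the last two lines, mirrors the distinction the measures above draw between seats filled and unique students enrolled.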


Qualitative Sources and Measures

  • Open-ended survey responses/Focus group responses

    o Academic improvement/achievement

        * Parent perceptions of student improvement in [topic]

        * Teacher perceptions of student improvement or achievement in [topic]

        * Student perceptions of [topic] achievement

    o Professional growth

        * Teacher perceptions of ability to implement [topic]

    o Program impact

        * Stakeholder perceptions of the impact of program innovations


Planning to write an application for a DoDEA grant and looking for an evaluation partner? SEG will assist you in formulating goals and provide an evaluation matrix at the grant application stage, at no cost to your institution, in exchange for being listed in your application as the external evaluator. Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.



Copperas Cove Independent School District (TX) Graduates
