
ABOUT US

SEG has the pleasure of providing evaluation services for 13 current Department of Defense Education Activity (DoDEA) grant projects and, since its inception, has supported 21 additional DoDEA projects that are now complete. The DoDEA grant program supports projects in school districts that serve military-connected students (https://dodeagrants.org/). Each year, pending funding availability, a new solicitation is announced around the end of January. This year's competition has been announced, and proposals are due on April 18, 2025. Grants are awarded for five-year projects.



SEG has deep experience working with DoDEA projects, including designing evaluation plans that align with DoDEA's vision for measuring project outcomes. SEG's K-12 Project Evaluation Manager, Dr. Stacy Ashworth, presented on this topic at the DoDEA Annual Community of Practice Meeting in November 2024 and wrote about it in a recent blog post.


DoDEA currently funds two programs: the Military-Connected Academic Support Program (MCASP), which supports projects that school districts identify as needs for their military-connected students, and the World Language Advancement and Readiness Program (WLARP). SEG has experience with both types of projects and is currently beginning to work with several districts on their 2025 proposals.


Shaffer Evaluation Group offers limited pro bono grant writing services to support grant applications such as DoDEA's. For DoDEA projects, we write the evaluation section, prepare the evaluation matrix, and develop the logic model at no cost in exchange for being named as the evaluator in your grant application. SEG would be happy to discuss your ideas and opportunities for collaboration in the current grant competition.

A few years ago, the Department of Defense Education Activity (DoDEA) changed how grant goals are written. At the DoDEA Annual Community of Practice Meeting in November 2024, SEG's Dr. Stacy Ashworth delivered an insightful presentation on selecting goal outcome measures to ensure grantee success. This month's blog post recaps Dr. Ashworth's guidance, offering strategies to help DoDEA grantees achieve their objectives.


A Portrait of a Successful Project

We will begin by looking at two projects with STEM goals. The project outcomes are listed as bullets beneath the project name.

Project A

•        Decline in math state test scores from SY2019 to SY2022

•        Decline in average score of military-connected (MC) students on state test score category from SY2019 to SY2022

 

Project B

•        1,400 students (potentially duplicated) participating in extracurriculars throughout the grant period

•        70% of teachers “agreed” or “strongly agreed” participating in STEM activities/using STEM items had improved students’ STEM skills

•        175 computer science and STEM lessons developed

•        1,100 students (potentially duplicated) participating in STEM Camp

 

 

From these outcomes, Project B appears clearly more successful than Project A. In fact, Project A and Project B are the exact same project.


DoDEA recently refined its approach to grant goal formulation, transitioning from narrowly defined goals with a single quantitative indicator to a more comprehensive framework that incorporates multiple indicators. This shift is exemplified by Project A and Project B. In Project A, the evaluation relied on limited indicators, leading to a perception of limited success by Year 4. However, under the revised goal-setting approach in Project B, additional indicators were assessed in Year 5, revealing the project's broader achievements and successful implementation. This case underscores the importance of utilizing diverse and comprehensive outcome measures to accurately capture the full impact of educational initiatives.


Ensuring Your Evaluation Plan Highlights Your Success

As DoDEA grant evaluators, we frequently encounter projects that appear less successful due to two primary challenges:


Data Collection Challenges: Often, districts plan to collect specific data but find it unavailable because the process is burdensome or lacks willing personnel at the school level.



Misalignment of Outcome Measures: There's a tendency to over-rely on state test data or other measures that, while valid, don't align with the project's objectives. For instance, if a project's strategy involves providing professional development to enhance math instruction, an aligned outcome would assess teacher math efficacy through surveys or focus groups. Although improvements in student standardized test scores are desirable, numerous factors influence these results, and they may not accurately reflect the project's success.


To address these issues, it's essential to:

  • Select a Balanced Mix of Quantitative and Qualitative Outcome Measures: Ensure that most are closely aligned with the project's goals.

  • Include Broader Outcomes Cautiously: While incorporating less-aligned outcomes, such as standardized test data, can provide additional insights, they should not be the sole indicators of success.


By adopting this comprehensive approach, we create multiple avenues to demonstrate a DoDEA project's success, leading to more accurate evaluations and meaningful educational improvements.


Sample Measures

When selecting outcome measures for your DoDEA grant evaluation, it's essential to consider existing data collection efforts and identify who will be responsible for gathering any new data. Additionally, exploring diverse methods of data analysis and presentation can tell a more comprehensive story of your project's impact. Sample lists of quantitative and qualitative sources are provided below.


Quantitative Sources and Measures

  • Assessments

  • Surveys

  • Feedback forms

  • Grades

o   Percent of students receiving A/B or D/F

o   Percent of students with D/F that improve to A/B by end of year

o   Improvement from prior year 

  • Course enrollment

o   Enrollment in advanced/optional courses

o   Number of seats filled in [area] courses

o   Number of unique students enrolled in [area] courses

  • Service recipients

o   Number of students receiving [type of service]

o   Number of students receiving [type of service] that no longer need service by end of year 

  • Number of [type] offered

o   Number of opportunities to learn about [program]

o   Number of [type of program] offered

  • Number of participants


Qualitative Sources and Measures

  • Open-ended survey responses/Focus group responses

o   Academic improvement/achievement

*Parent perceptions of student improvement in [topic]

*Teacher perceptions of student improvement or achievement in [topic]

*Student perceptions of [topic] achievement

o   Professional growth

*Teacher perceptions in ability to implement [topic]

o   Program impact

*Stakeholder perceptions of impact of program innovations


Planning to write an application for a DoDEA grant and looking for an evaluation partner? SEG will assist you with formulating goals and provide you with an evaluation matrix at the grant application stage at no cost to your institution in exchange for being listed in your application as the external evaluator. Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.



Copperas Cove Independent School District (TX) Graduates

Food and energy sovereignty are foundational to the self-determination and well-being of Indigenous communities. Food sovereignty refers to the right of peoples to healthy and culturally appropriate food produced through ecologically sound and sustainable methods, and their right to define their own food and agricultural systems. Energy sovereignty extends this principle to the realm of energy resources, where communities have the authority to produce, distribute, and use energy in ways that align with their values and environmental stewardship. Evaluating these initiatives requires approaches that center Indigenous voices, respect traditional knowledge systems, and embrace holistic methodologies.


Culturally Responsive Evaluation Frameworks

Culturally responsive evaluation (CRE) is a critical approach for assessing Indigenous food and energy sovereignty initiatives. CRE is a holistic framework for centering evaluation in culture (Frierson, Hood, Hughes, and Thomas, 2010). In evaluating Indigenous food and energy sovereignty initiatives, evaluators must steep themselves in a community's values, traditions, and aspirations. For example, measures of success might prioritize the restoration of native plant species, intergenerational transfer of agricultural knowledge, or reduced reliance on external energy grids—goals that standard Western evaluation metrics may overlook.


Community-based participatory research (CBPR) aligns well with CRE, as it involves co-creating evaluation designs with Indigenous communities. The W.K. Kellogg Foundation (2001) defined community-based participatory research as “a collaborative approach to research that equitably involves all partners in the research process and recognizes the unique strengths that each brings” (p. 2). Participatory approaches such as CBPR are essential for fostering trust and avoiding extractive research practices.


Decolonizing Data Practices

Evaluators must also address the power dynamics inherent in data collection and interpretation. Linda Tuhiwai Smith (2012) emphasized the importance of decolonizing research methods to avoid perpetuating systems of oppression. In practice, this means prioritizing Indigenous data sovereignty—ensuring that data is controlled and interpreted by the community it represents. The First Nations Information Governance Centre’s principles of OCAP® (Ownership, Control, Access, and Possession) offer a model for ethical data practices, asserting that Indigenous peoples should control data collection processes and should own and control how that information is used.


Integrating Holistic Practices and Indicators

Evaluation of Indigenous food and energy sovereignty should account for the interconnectedness of ecological, social, cultural, and economic dimensions. The AIHEC evaluation framework (2008) proposes a set of evaluation practices based on the core Indigenous values of (a) being a people of a place, (b) recognizing our gifts, (c) honoring family and community, and (d) respecting sovereignty. Holistic evaluation practices that matter to Indigenous communities include situating a program by describing its relationship to the community, including its history and current situation, and recognizing that food and energy sovereignty indicators may emphasize holistic measures such as community health and well-being. An energy sovereignty evaluation, for example, could pair metrics like reduced carbon footprints with community-defined indicators, such as the revitalization of traditional ecological knowledge.


Conclusion

Evaluating Indigenous food and energy sovereignty requires a shift from conventional evaluation methodologies to those that prioritize Indigenous worldviews and self-determination. By employing culturally responsive, decolonizing, and holistic approaches, evaluators can not only measure outcomes but also contribute to the empowerment and resilience of Indigenous communities. As program evaluators, we have a responsibility to uplift Indigenous voices and ensure that evaluation processes align with the goals and values of the communities we serve.


Shaffer Evaluation Group, 1311 Jamestown Road, Suite 101, Williamsburg, VA 23185   833.650.3825

Contact Us     © Shaffer Evaluation Group LLC 2020 
