
The Department of Defense Education Activity (DoDEA) recently changed how grant goals are written. At the DoDEA Annual Community of Practice Meeting in November 2024, SEG's Dr. Stacy Ashworth delivered an insightful presentation on selecting goal outcome measures to ensure grantee success. This month's blog post recaps Dr. Ashworth's guidance, offering strategies to help DoDEA grantees achieve their objectives.


A Portrait of a Successful Project

We will begin by looking at two projects with STEM goals. The project outcomes are listed as bullets beneath the project name.

Project A

  • Decline in math state test scores from SY2019 to SY2022

  • Decline in the average score of military-connected (MC) students in the state test score category from SY2019 to SY2022

Project B

  • 1,400 students (potentially duplicated) participating in extracurriculars throughout the grant period

  • 70% of teachers “agreed” or “strongly agreed” that participating in STEM activities/using STEM items had improved students’ STEM skills

  • 175 computer science and STEM lessons developed

  • 1,100 students (potentially duplicated) participating in STEM Camp

From the outcomes listed, Project B clearly appears more successful than Project A. However, Project A and Project B are the exact same project.


DoDEA recently refined its approach to grant goal formulation, transitioning from narrowly defined goals with a single quantitative indicator to a more comprehensive framework that incorporates multiple indicators. This shift is exemplified by Project A and Project B. In Project A, the evaluation relied on limited indicators, leading to a perception of limited success by Year 4. However, under the revised goal-setting approach in Project B, additional indicators were assessed in Year 5, revealing the project's broader achievements and successful implementation. This case underscores the importance of utilizing diverse and comprehensive outcome measures to accurately capture the full impact of educational initiatives.


Ensuring Your Evaluation Plan Highlights Your Success

As DoDEA grant evaluators, we frequently encounter projects that appear less successful due to two primary challenges:


1. Data Collection Challenges: Often, districts plan to collect specific data but find it unavailable because the collection process is burdensome or because no one at the school level is willing to take it on.



2. Misalignment of Outcome Measures: There's a tendency to over-rely on state test data or other measures that, while valid, don't align with the project's objectives. For instance, if a project's strategy involves providing professional development to enhance math instruction, an aligned outcome would assess teacher math efficacy through surveys or focus groups. Although improvements in student standardized test scores are desirable, numerous factors influence these results, and they may not accurately reflect the project's success.


To address these issues, it's essential to:

  • Select a Balanced Mix of Quantitative and Qualitative Outcome Measures: Ensure that most are closely aligned with the project's goals.

  • Include Broader Outcomes Cautiously: While incorporating less-aligned outcomes, such as standardized test data, can provide additional insights, they should not be the sole indicators of success.


By adopting this comprehensive approach, we create multiple avenues to demonstrate a DoDEA project's success, leading to more accurate evaluations and meaningful educational improvements.


Sample Measures

When selecting outcome measures for your DoDEA grant evaluation, it's essential to consider existing data collection efforts and identify who will be responsible for gathering any new data. Additionally, exploring diverse methods of data analysis and presentation can tell a more comprehensive story of your project's impact. Sample lists of quantitative and qualitative sources are provided below, each followed by a brief sketch of how a few of the measures might be computed in practice.


Quantitative Sources and Measures

  • Assessments

  • Surveys

  • Feedback forms

  • Grades (see the sketch following this list)

o   Percent of students receiving A/B or D/F

o   Percent of students with D/F who improve to A/B by end of year

o   Improvement from prior year 

  • Course enrollment

o   Enrollment in advanced/optional courses

o   Number of seats filled in [area] courses

o   Number of unique students enrolled in [area] courses

  • Service recipients

o   Number of students receiving [type of service]

o   Number of students receiving [type of service] who no longer need the service by end of year

  • Number of [type] offered

o   Number of opportunities to learn about [program]

o   Number of [type of program] offered

  • Number of participants
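
To make a few of these measures concrete, here is a minimal sketch in Python (using pandas) of how a district data team might compute the grade-based measures and the duplicated versus unduplicated participation counts listed above. Every file name and column name here (grades_sy2024.csv, student_id, fall_grade, spring_grade) is a hypothetical example, not a prescribed export format.

# Minimal sketch of computing grade-based outcome measures with pandas.
# All file and column names are hypothetical examples.
import pandas as pd

grades = pd.read_csv("grades_sy2024.csv")  # hypothetical gradebook export

high = {"A", "B"}
low = {"D", "F"}

# Percent of students receiving A/B or D/F at end of year.
pct_ab = grades["spring_grade"].isin(high).mean() * 100
pct_df = grades["spring_grade"].isin(low).mean() * 100

# Percent of students with a fall D/F who improve to A/B by end of year.
started_low = grades[grades["fall_grade"].isin(low)]
pct_improved = started_low["spring_grade"].isin(high).mean() * 100

# Duplicated vs. unduplicated participation counts (cf. Project B above).
# Assumes a log with one row per student per activity session.
log = pd.read_csv("stem_activity_log.csv")  # hypothetical activity log
participations = len(log)                      # potentially duplicated
unique_students = log["student_id"].nunique()  # unduplicated

print(f"A/B: {pct_ab:.1f}%  D/F: {pct_df:.1f}%  improved: {pct_improved:.1f}%")
print(f"participations: {participations}  unique students: {unique_students}")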


Qualitative Sources and Measures

  • Open-ended survey responses/Focus group responses

o   Academic improvement/achievement

* Parent perceptions of student improvement in [topic]

* Teacher perceptions of student improvement or achievement in [topic]

* Student perceptions of [topic] achievement

o   Professional growth

* Teacher perceptions of their ability to implement [topic]

o   Program impact

* Stakeholder perceptions of the impact of program innovations
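
As a rough illustration of how coded qualitative data can be summarized, the short Python sketch below tallies analyst-assigned theme codes across open-ended responses. The stakeholder groups, theme labels, and responses are all hypothetical; in practice the codes would come from a codebook developed for the project.

# Minimal sketch: summarizing analyst-assigned theme codes from
# open-ended survey and focus group responses. All groups, themes,
# and responses here are hypothetical examples.
from collections import Counter

# One entry per coded response: (stakeholder group, assigned theme codes).
coded_responses = [
    ("parent",  ["student_improvement"]),
    ("teacher", ["student_improvement", "professional_growth"]),
    ("teacher", ["professional_growth"]),
    ("student", ["program_impact", "student_improvement"]),
]

theme_counts = Counter()
for _group, themes in coded_responses:
    theme_counts.update(themes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} coded responses")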


Planning to write an application for a DoDEA grant and looking for an evaluation partner? SEG will assist you with formulating goals and provide you with an evaluation matrix at the grant application stage at no cost to your institution in exchange for being listed in your application as the external evaluator. Contact us today for a free 30-minute consultation: seg@shafferevaluation.com.




Food and energy sovereignty are foundational to the self-determination and well-being of Indigenous communities. Food sovereignty refers to the right of peoples to healthy and culturally appropriate food produced through ecologically sound and sustainable methods, and their right to define their own food and agricultural systems. Energy sovereignty extends this principle to the realm of energy resources, where communities have the authority to produce, distribute, and use energy in ways that align with their values and environmental stewardship. Evaluating these initiatives requires approaches that center Indigenous voices, respect traditional knowledge systems, and embrace holistic methodologies.


Culturally Responsive Evaluation Frameworks

Culturally responsive evaluation (CRE) is a critical approach for assessing Indigenous food and energy sovereignty initiatives. CRE is a holistic framework for centering evaluation in culture (Frierson, Hood, Hughes, and Thomas, 2010). In evaluating Indigenous food and energy sovereignty initiatives, evaluators must steep themselves in a community's values, traditions, and aspirations. For example, measures of success might prioritize the restoration of native plant species, intergenerational transfer of agricultural knowledge, or reduced reliance on external energy grids—goals that standard Western evaluation metrics may overlook.


Community-based participatory research (CBPR) aligns well with CRE, as it involves co-creating evaluation designs with Indigenous communities. The W.K. Kellogg Foundation (2001) defined community-based participatory research as “a collaborative approach to research that equitably involves all partners in the research process and recognizes the unique strengths that each brings” (p. 2). Participatory approaches such as CBPR are essential for fostering trust and avoiding extractive research practices.


Decolonizing Data Practices

Evaluators must also address the power dynamics inherent in data collection and interpretation. Linda Tuhiwai Smith (2012) emphasized the importance of decolonizing research methods to avoid perpetuating systems of oppression. In practice, this means prioritizing Indigenous data sovereignty—ensuring that data is controlled and interpreted by the community it represents. The First Nations Information Governance Centre’s principles of OCAP® (Ownership, Control, Access, and Possession) offer a model for ethical data practices, asserting that Indigenous peoples should have control over data collection processes and should own and control how this information is used.


Integrating Holistic Practices and Indicators

Evaluation of Indigenous food and energy sovereignty should account for the interconnectedness of ecological, social, cultural, and economic dimensions. The AIHEC evaluation framework (2008) proposes a set of evaluation practices based on the core Indigenous values of (a) being a people of a place, (b) recognizing our gifts, (c) honoring family and community, and (d) respecting sovereignty. Holistic practices that matter to Indigenous communities include situating a program by describing its relationship to the community, including its history and current situation, and recognizing that food and energy sovereignty indicators may emphasize holistic measures such as community health and well-being. An energy sovereignty evaluation, for example, could pair metrics like reduced carbon footprint with community-defined indicators such as the revitalization of traditional ecological knowledge.


Conclusion

Evaluating Indigenous food and energy sovereignty requires a shift from conventional evaluation methodologies to those that prioritize Indigenous worldviews and self-determination. By employing culturally responsive, decolonizing, and holistic approaches, evaluators can not only measure outcomes but also contribute to the empowerment and resilience of Indigenous communities. As program evaluators, we have a responsibility to uplift Indigenous voices and ensure that evaluation processes align with the goals and values of the communities we serve.

In higher education, the concept of "town and gown" is often used to describe the relationship between a college or university (the "gown," symbolizing academic life) and the surrounding community or city (the "town"). Historically, this relationship has been complex, with higher education institutions sometimes seen as isolated or separate entities from their local communities. In modern higher education strategic planning, however, the concept plays a significant role in shaping institutional priorities and fostering collaboration between the college and its community for mutual benefit.


By setting clear objectives for community consultation, engaging diverse stakeholders, employing mixed methods, and creating transparent feedback loops, colleges can cultivate an environment where community voices shape the path forward. This not only strengthens the plan itself but also deepens trust and commitment across the town and gown divide, laying the groundwork for successful plan implementation.


Here’s a breakdown of an approach to community consultation that ensures an inclusive, transparent, and actionable strategic plan.


1. Setting Clear Objectives

The first step in effective community consultation is defining its purpose and scope. Is the college looking for input on academic program offerings, new or enhanced community partnerships, or donor development? The clarity of purpose not only shapes the tools and techniques we choose but also sets expectations for stakeholder engagement. Our team collaborates closely with college leadership to outline these goals, ensuring alignment with the institution’s broader mission and strategic priorities.


2. Identifying Key Stakeholders

Colleges are inherently complex ecosystems with a range of stakeholders, each bringing unique perspectives and priorities. A stakeholder list for a community college usually includes students, faculty, staff, administration, K-12 partners, local industry representatives, and community organizations. Each group’s input is essential for building a well-rounded strategic plan.


The importance of inclusive engagement cannot be overstated. Students can highlight barriers to academic success, while faculty and staff offer insights into operational and pedagogical challenges. Industry partners and K-12 school districts provide valuable perspectives on skills alignment and pipeline development. By reaching out to these groups, we lay the foundation for a plan that reflects the entire community’s needs.


3. Designing the Engagement Strategy

A robust consultation plan leverages a mix of qualitative and quantitative methods. Surveys provide breadth, capturing a wide array of viewpoints with data that can be analyzed for patterns and trends. Focus groups, on the other hand, offer depth, allowing for nuanced discussions that unearth issues that might be missed in a survey format. Town halls offer another alternative, providing an open forum where all voices are heard in real time, facilitating dialogue and fostering a sense of shared purpose.


4. Creating a Conducive Environment for Dialogue

Successful consultations rely on trust and a genuine invitation to participate. In facilitating focus groups and town halls, we create a neutral "safe space" where participants feel they can share openly without fear of reprisal. Stakeholders also must feel their input is valued and impactful. This means managing sessions where quieter voices are encouraged to share, dominant speakers are balanced, and responses are free from judgment. In town halls, interactive technology, such as live polling tools, encourages real-time feedback and maintains engagement. This approach can help bridge the gap between in-person attendees and virtual participants, ensuring that everyone can participate regardless of location.


5. Analyzing and Synthesizing Feedback

The consultation process doesn’t end with gathering data—it extends to thorough analysis and synthesis. Quantitative survey results are segmented by stakeholder group to identify trends, and qualitative data from focus groups, town halls, and open-ended survey questions are coded for themes and key insights. This dual approach reveals where different stakeholder priorities align or diverge, guiding decision-making. Feedback might show, for example, that students and faculty both emphasize the need for enhanced support services, whereas industry partners prioritize updates to technical training programs. Such insights allow the college to develop targeted strategic initiatives that cater to these needs without compromising overall institutional goals.
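
As a simple illustration of the segmentation step, the Python sketch below (using pandas) averages two hypothetical Likert items by stakeholder group and flags where group priorities diverge. The column names and the 1-5 scale are assumptions for the example, not a required survey design.

# Minimal sketch: segmenting Likert-style survey results by stakeholder
# group with pandas. Column names and the 1-5 scale are assumptions.
import pandas as pd

responses = pd.DataFrame({
    "stakeholder": ["student", "student", "faculty", "faculty", "industry"],
    "support_services": [4, 5, 4, 5, 3],    # "Support services need enhancement"
    "technical_training": [3, 3, 3, 2, 5],  # "Technical programs need updating"
})

items = ["support_services", "technical_training"]

# Mean rating on each item, segmented by stakeholder group.
by_group = responses.groupby("stakeholder")[items].mean()
print(by_group.round(2))

# Items where group priorities diverge most (largest spread in means).
spread = (by_group.max() - by_group.min()).sort_values(ascending=False)
print(spread)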


6. Communicating Findings

Transparency is key to maintaining trust. After synthesizing feedback, we recommend communicating findings back to stakeholders in an accessible format, whether through reports, presentations, or online dashboards. Highlighting how stakeholder input has been used to shape strategic directions reassures participants that their involvement has made an impact.


7. Integrating Consultation into Plan Development

Community consultation shouldn’t be a one-time event but rather an embedded practice within the institution’s planning cycle. Using a planning committee composed of diverse stakeholder representatives, for example, is one way to ensure that the college's strategic plan reflects community needs. We also advocate for mechanisms that allow colleges to continuously engage with stakeholders, adjusting their strategies as the landscape evolves. This dynamic approach ensures that the college remains responsive and proactive in fulfilling its mission.


Final Thoughts

A college can bridge the town and gown divide by embedding community consultation and involvement throughout its strategic planning cycle. This approach not only strengthens the plan itself but deepens trust and commitment across the community, laying the groundwork for successful institutional growth.


Our sister firm, Gaston-Shaffer, specializes in community-driven strategic planning services. Contact them today to learn how they can enhance your strategic planning processes.

