What Education Evaluation Services Does BIG Offer?
Education evaluation and research services for projects around the country are offered through the University of Montana's Broader Impacts Group (BIG).
Across varied collaborative projects, Dr. Beth Covitt and her team center professional relationships, ethics, responsiveness, accessibility, and high-quality research and evaluation products. Covitt has served as PI, Co-PI, or evaluator on numerous federally funded grants (from the Dept. of Ed., EPA, IMLS, NASA, NOAA, NSF, and USDA) and on foundation-funded projects.
Efforts address science and environmental education research and program evaluation, science literacy, design-based implementation research, research-practice partnerships, and partnerships with Indigenous communities.
Our team can develop and implement evaluation:
- Plans and methods
- Logic models
- Timelines and scopes of work
- Data collection instruments and data collection
- Data analysis, recommendations, and reporting
Evaluation plans and scopes are crafted in response to the particular needs of diverse projects.
Evaluation services are generally structured as subaward contracts through UM, which provides a variety of services and benefits (e.g., IRB review, communications, and evaluation data collection and storage platforms).
For more information, contact beth.covitt@umontana.edu
Evaluation and Research Team
Beth Covitt
Head of Science Education Research & Evaluation
Nicollette Frank
Education Evaluation & Research Associate
Annie Caires
Education Research Associate
What Are Common Types of Education Evaluation?
Formative and summative are two big umbrellas in evaluation. Formative evaluation provides information that can support “just-in-time” changes in the ongoing operation and implementation of projects. Summative evaluation focuses on documenting and reporting project products, results, and impacts. Many projects utilize both formative and summative evaluation in concert. Conducting culturally responsive evaluation is also critically important for projects that involve diverse groups of partners, participants, and stakeholders.
Formative Evaluation
- Implementation evaluation examines the process of implementing the project and whether the project is operating as planned.
- Progress evaluation assesses progress in meeting the project’s goals. It involves ongoing collection of information to measure the ways and extent to which benchmarks for progress are being met and providing information for improving effectiveness during the life of the project.
Summative Evaluation
- Outcome or impact evaluation examines the extent to which the project achieves its ultimate intended outcomes. Examples of outcome or impact evaluation questions include:
- Did participation lead to an increase in students' expressed interest in studying STEM disciplines in college?
- Did participants demonstrate and/or report increased STEM knowledge and skills?
- Did participants report increased engagement with STEM issues and activities in their communities?
Culturally Responsive Evaluation
- Culturally responsive and collaborative evaluation centers ethical evaluation practices including, for example:
- Systematic co-development with project leaders and engagement with project stakeholders to develop and implement appropriate methods and a shared, contextualized, and nuanced understanding of the project.
- Centering project community’s ownership and definitions of success in evaluation, which may reflect multiple ways of knowing and valuing.
- Making findings accessible to stakeholders.
Education Evaluation Stories From BIG
Montana Space Grant Consortium Evaluation. The Montana Space Grant Consortium (MTSGC) is led by Dr. Angela Des Jardins and Dr. Meredith Hecker at Montana State University. MTSGC collaborates with colleges and universities across the state to strengthen education in Montana in NASA-related fields. To accomplish its mission, MTSGC offers a variety of innovative and experiential student and faculty STEM engagement and funding opportunities. Evaluation of MTSGC includes formative and summative components. Formative information about how activities are working (e.g., through feedback from activity participants and facilitators) is continuously collected and reported in a timely fashion. This ongoing information helps MTSGC refine the scope of activities they offer and the design of those activities to meet the needs of MTSGC participants and partners. Summative evaluation of MTSGC provides documentation of cumulative impacts, outcomes, and "lessons learned" to support reporting and dissemination purposes. PHOTO: A Montana Space Grant team from Salish Kootenai College launches a radiosonde with a balloon during the October 2023 annular eclipse.
UM BRIDGES Project. UM BRIDGES is an NSF Research Traineeship (NRT) project led by Dr. Laurie Yung (Franke College of Forestry) and Dr. Andrew Wilcox (Geosciences). This project trains UM graduate students as future leaders to advance societally relevant science toward more sustainable food-energy-water systems. Evaluation of this program includes multi-year formative and summative evaluation elements that track the experiences and project outcomes for graduate student cohorts and participating faculty. Evaluation data collection methods include surveys, interviews, and focus groups. Information provided by the evaluation has supported ongoing refinements to the project and is used in required NSF reporting. PHOTO: UM BRIDGES students visit Grand Coulee Dam during a food, energy, water field trip in June 2018.
Bitterroot Summer of Science, Heman Foundation. For over a decade, spectrUM Discovery Area has partnered with communities in the Bitterroot Valley to deliver STEM programming in settings both within and outside of K-12 schools. Evaluation of the Summer of Science programming, which is supported in part by the Jane S. Heman Foundation, has been designed to provide both formative and summative findings. Evaluation questions focus on eliciting the experiences and perceptions of partners who make the programs possible and audiences who participate. Combinations of interviews and surveys are used to collect data, and each year a report is written to (1) articulate successes and challenges associated with the collaboration and programming, and (2) provide suggestions for changes, which emerge from evaluation responses. In addition, each year, evaluation questions are updated and revised to realign with the most current issues and questions facing the partnership. A guiding principle is to make sure that voices representing the spectrum of project stakeholders are accurately and adequately represented. PHOTO: Community members create structures and creatures with cardboard and Makedo tools at spectrUM's Summer of Science at the Darby Public Library.