CEEE Evaluation and Education Research

The CIRES CEEE Evaluation and Education Research team uses culturally responsive and innovative approaches to evaluate the impact of projects focused on the Earth and the environment, and of related educational programs. The team designs research studies to examine learning processes and educational outcomes by asking research questions, collecting and interpreting data to answer those questions, and sharing findings with partners through effective reporting and publications.

CEEE evaluators and education researchers bring multidisciplinary backgrounds and training to our partnerships. We work with scientists, educators, and community organizations in formal and informal education settings, with scientific research centers or offices, and with career development programs such as undergraduate research experiences and early career trainings.

Our team has experience in research design; survey development; qualitative, quantitative, and mixed methods studies; social network analysis; interview and focus group facilitation; website and social media analytics; eye-tracking studies; and program activity tracking. Learn more about our work in the tabs below.

Our team currently evaluates several NSF-funded Research Experiences for Undergraduates (REU) programs and course-based undergraduate research experiences (CUREs). We work with program coordinators and instructors from the proposal stage through the final program impact report. Evaluation activities can include customizing formative survey questions for program improvement and refinement; using validated survey questions to assess program outcomes such as science self-efficacy, science identity, and career plans; and qualitative measures, such as focus groups, for a deeper understanding of students’ experiences in a program or course.

See an example evaluation report from the Research Experiences in Alpine Meteorology (REALM) program:

         REALM Impact Story Cover

Read our recent education research publications about undergraduate research experiences:

  • Okochi, C., Gold, A. U., Christensen, A., & Batchelor, R. (2023). Early access to science research opportunities: Growth within a geoscience summer research program for community college students. PLOS ONE. https://doi.org/10.1371/journal.pone.0293674
  • Gold, A. U., Atkins, R., & McNeal, K. S. (2021). Undergraduates’ graph interpretation and scientific paper reading shift from novice- to expert-like as a result of participation in a summer research experience: A case study. Scholarship and Practice of Undergraduate Research, 5(2), 7–19. https://doi.org/10.18833/spur/5/2/2

Our team evaluates curricula and their impacts on students. We typically evaluate curriculum at the project level, including iterative formative evaluation during the development phase; workshop evaluation of professional development trainings for teachers using the curriculum; analysis of student data and teacher implementation data; and web analytics to assess the reach of curriculum use. We have experience conducting needs assessments in the early stages of a project to determine teacher needs and help project teams deliver successful and relevant curricula to classrooms.

See an example evaluation report from the Research Experiences in Alpine Meteorology (REALM) program:

         REALM Impact Story Cover

Read our recent education research publications about curriculum development:

Our team currently evaluates several informal education projects. Two examples are described below.  

At the University of Colorado Boulder’s Fiske Planetarium, we evaluate video content against funding goals, the relationships built through collaborations with video experts, and audience impact through surveys. We analyze the reach of the videos by tracking download requests and web analytics, and we have surveyed planetarium managers to learn how the videos were used and received by audiences.

See an example report from the Fiske Explorations project:

        Cover page for Fiske Summative Report

For the collaborative We are Water project, we conducted needs assessments and exhibit usability studies during the development phase of the exhibition, which is currently traveling to rural and Tribal libraries across the Southwest. As the exhibition travels, we track visitor engagement and use feedback to refine associated community events. We also collaborate with an external evaluation team that assesses the impact of the project on library and community partnerships through interviews. In addition, our research group is working with library staff and visitors to understand the impact of the project on the learning ecosystems in their communities.

See an example evaluation report from the We are Water project:

       We are Water Summative Report Slides Cover

Our team provides formative and summative evaluation of data centers and other large, multi-organization projects, using collaborative approaches with project teams and partners. Two examples are described here.

The Environmental Data Science Innovation and Inclusion Lab (ESIIL): ESIIL is an NSF-funded data synthesis center led by CU Boulder, in collaboration with NSF’s CyVerse (University of Arizona) and the University of Oslo. ESIIL enables a global community of environmental data scientists to leverage the wealth of environmental data and emerging analytics to develop science-based solutions to pressing challenges in biology and other environmental sciences. CIRES CEEE supports this center through collaborative evaluation with the leadership team to assist with the development of the center and provide feedback on data synthesis working groups, events, and connections within the ESIIL network. The CIRES team contributes skills and expertise in collaborative evaluation approaches, needs assessment, and social network analysis.

The Navigating the New Arctic Community Office (NNA-CO): NNA-CO is an NSF community program office led by CU Boulder in collaboration with Alaska Pacific University and the University of Alaska Fairbanks. NNA-CO supports the NSF-funded NNA Arctic Research program by building awareness, partnerships, opportunities, and resources for collaboration and equitable knowledge generation through research design and implementation, and by coordinating effective knowledge dissemination, education, and outreach. In coordination with the project leadership, CIRES CEEE provides process evaluation of the team; feedback on events, including a large annual meeting; tracking methods for project deliverables; and an ongoing needs assessment and focus group engagement with participating researchers and Arctic community members, which helped define the goals and activities of the community office and assess the extent of change from those activities.

See an example evaluation report from NNA-CO:

       NNA-CO Meeting Report Cover

Our team contributes to geoscience education research on topics around spatial reasoning, sense of place, and systems thinking.  

See all CEEE publications or read about a few of our recent education research publications below:

  • Gold AU, Geraghty Ward EM, Marsh CL, Moon TA, Schoeneman SW, Khan AL, et al. (2023). Measuring novice-expert sense of place for a far-away place: Implications for geoscience instruction. PLoS ONE 18(10): e0293003. https://doi.org/10.1371/journal.pone.0293003
  • Littrell, M. K., Gold, A. U., Kosley, K.L.K., May, T.A., Leckey, E., & Okochi C. (2022). Transformative experience in an informal science learning program about climate change. Journal of Research in Science Teaching, 1-25. https://doi.org/10.1002/tea.21750
  • Gold, A. U., Pendergast, P., Ormand, C., Budd, D., & Mueller, K. (2018). Improving spatial thinking skills among undergraduate geology students through short online training exercises. International Journal of Science Education. https://doi.org/10.1080/09500693.2018.1525621

Learn more about Evaluation

Explore our Evaluation 101 pages and a curated Collection of Evaluation Tools.

Evaluation and Education Research Value Statement

At CIRES CEEE, our evaluation and education research efforts respect persons, communities, cultures, and identities. We use a variety of methods to match the context and the evaluation question, acknowledging that there are multiple ways to conduct research or evaluation. Collaborating, listening, and reflecting are important parts of building relationships, choosing methods, understanding data, and interpreting findings. We see education research and evaluation as part of the process of learning how science education, engagement, and outreach can best support learners, educators, and scientists.


Get involved and stay up-to-date with CIRES CEEE.