CEEE Evaluation and Education Research
Education Research, Collaborative Program Evaluation, Evaluation Services and Consultations
The CIRES CEEE Evaluation and Education Research team uses innovative and culturally responsive approaches to evaluate the impact of educational programs focused on the Earth, the environment, and related topics. The team designs research studies to examine learning processes and educational outcomes by asking research questions, inviting participant perspectives, synthesizing data to answer those questions, and sharing findings with partners through effective reporting and publications.
CEEE evaluators bring multidisciplinary backgrounds and training to our partnerships. We work with scientists, educators, and community organizations in formal and informal education settings, with scientific research centers and offices, and with career development programs such as undergraduate research experiences and early career trainings. This figure illustrates our team’s expertise and topical areas of current projects.
We have experience and expertise in survey development, interview and focus group facilitation, social network analysis, website and social media analytics, and program activity tracking. Learn more about our work in the tabs below.
The CEEE team conducts high-quality research and evaluation that provides data to continuously improve projects and document impacts and outcomes. Our work engages project teams in iterative and reflective processes that support the refinement of project activities and align research and evaluation questions with program goals to meet the needs of project partners and their audiences.
We collaborate on large- and small-scale projects and with diverse groups through all project stages, from needs assessment to summative evaluation. We support project teams in:
Developing evaluation plans and budgets
Writing proposals
Identifying and aligning program objectives
Integrating project goals and activities
Crafting logic models
Securing approval from the Institutional Review Board
Communicating with participants and conducting research and evaluation activities
Analyzing data
Sharing findings through reports, publications, impact stories, graphics, and other means
Our team currently evaluates several NSF-funded Research Experiences for Undergraduates (REU) programs and Course-based Undergraduate Research Experiences (CUREs). We work with program coordinators and instructors from the proposal stage through the final program impact report. Evaluation activities can include custom formative surveys for program improvement and refinement; validated quantitative measures that assess program outcomes such as science self-efficacy, science identity, and career plans; and qualitative measures, such as focus groups and interviews, for a deeper understanding of students’ experiences in a program or course.
See an example evaluation impact story from the Research Experience in Alpine Meteorology (REALM) program, a summer research experience for undergraduate students.
Our team evaluates curricula, teaching resources, and teacher trainings and their impact on teachers and their students. We typically evaluate curricula throughout development and implementation, including iterative formative evaluation during the development phase; evaluation of professional development training for teachers to learn how to implement the curriculum; evaluation of student data and teacher implementation data; and web analytics to assess the reach and use of curricula. We have conducted needs assessments in the early stages of projects to determine teacher demand and to help project teams design and deliver successful and relevant curricula to classrooms.
See an example evaluation report from the HEART Force program, an education program focused on building community resilience that includes a curriculum, student community action projects, scenario-based role-play games, teacher training, and a teacher community of practice.
Read our recent education research publications about curriculum development.
Our team evaluates several informal, or outside-of-the-classroom, education projects including planetarium films, library exhibits, community events, and film festivals. We work closely with project teams to conduct needs assessments and exhibit usability studies. We select tailored evaluation techniques when collecting audience feedback, translate evaluation instruments into the primary language of audiences, and use accessible language. We track visitor engagement through touch-screen interactive surveys, hands-on activities integrated into exhibits, downloads of digital resources, web analytics, and participation and attendance numbers. We interview project partners and participants and conduct listening sessions to support program development that meets audience and community needs, and to better understand a project's impact on local learning ecosystems and community capacity building.
See an example evaluation report from the We are Water (WaW) project, which includes an exhibition that is currently traveling to rural and Tribal libraries across the Southwest.
Our team provides formative and summative evaluation for data centers, community offices, and other large, multi-organization projects, using collaborative approaches with project teams and partners. CEEE works with leadership teams and offers guidance on developing a logic model and theory of change. We provide process evaluation, impact measures, and feedback on events; develop tracking methods for project deliverables and measures of success; conduct social network analysis to document the evolution of emerging networks; and provide ongoing evaluation to assess the extent of change from those activities. We often contribute to reporting as well as publications about data centers and community offices.
See an example evaluation report from the NSF-funded Environmental Data Science Innovation and Inclusion Lab (ESIIL), a data synthesis center focused on advancing environmental data science and leveraging the wealth of environmental data and innovative analyses to develop science-based solutions to solve pressing challenges in biology and other environmental sciences.
See an example evaluation report from the Navigating the New Arctic Community Office (NNA-CO), which supports all NSF-funded NNA Arctic research projects by building awareness, partnerships, opportunities, and resources for collaboration and equitable knowledge generation through research design and implementation, and by coordinating effective knowledge dissemination, education, and outreach.
Our team contributes to geoscience education research on topics including spatial reasoning, sense of place, environmental action and sense of agency in youth, climate and resilience education, geoscience career development, and systems thinking.
See all CEEE publications or read about a few of our recent education research publications below:
Gold AU, Geraghty Ward EM, Marsh CL, Moon TA, Schoeneman SW, Khan AL, et al. (2023). Measuring novice-expert sense of place for a far-away place: Implications for geoscience instruction. PLoS ONE 18(10): e0293003. https://doi.org/10.1371/journal.pone.0293003
Littrell, M. K., Gold, A. U., Kosley, K.L.K., May, T.A., Leckey, E., & Okochi C. (2022). Transformative experience in an informal science learning program about climate change. Journal of Research in Science Teaching, 1-25. https://doi.org/10.1002/tea.21750
Gold, A. U., Pendergast, P., Ormand, C., Budd, D., & Mueller, K. (2018). Improving spatial thinking skills among undergraduate geology students through short online training exercises. International Journal of Science Education. https://doi.org/10.1080/09500693.2018.1525621
Semmens, K., Sickler, J., Maxfield, K., Goldner, M., Curry, D., Peddicord, H., … Carr, R. H. (2025). Building Insights Through Observation: Integrating Art and Science to Support Sensemaking. Science Scope, 48(2), 30–38. https://doi.org/10.1080/08872376.2025.2463906
Schloesser, K., Davis, R., Ruffin, T., Gold, A. U., Christensen, A., Littrell, M. K., & Boyd, K. J. (2024). Centering and uplifting youth voice in planning for a more resilient climate future in rural Colorado: A case study of a student resilience team asking for change. Frontiers in Climate, 6(1), Article 1408872. https://doi.org/10.3389/fclim.2024.1408872
Boyd, K. J., Busch, K. C., Gold, A. U., Ward, E. G., Niepold, F., Poppleton, K., … Morrison, D. L. (2024). Using social network analysis to assess connections within climate and energy education organizations: A case study conducted by the Climate Literacy and Energy Awareness Network (CLEAN). Environmental Education Research, 1–24.
Schaecher, A., Rongstad Strong, B., & Gold, A. (2023). We are Water patch promotes action for watershed stewardship. Connected Science Learning, 5(4).
Okochi, C., Gold, A. U., Christensen, A., & Batchelor, R. (2023). Early access to science research opportunities: Growth within a geoscience summer research program for community college students. PLOS ONE.
Explore our Evaluation 101 pages and a curated Collection of Evaluation Tools, a database of simple tools for evaluating programs at low cost. CEEE evaluators co-developed this database and its accompanying guide.
At CIRES CEEE, our evaluation and educational research efforts respect persons, communities, and cultures. We use a variety of methods to match the context and the evaluation questions, acknowledging there are multiple ways to conduct research or evaluation. Collaborating, listening, and reflecting are important parts of building relationships, choosing methods, understanding data, and interpreting findings. We see evaluation and education research as part of the process of learning how science education, engagement, and outreach can best support learners, educators, and scientists.