Pellegrino (2022) offered a detailed exploration of educational assessment from a learning sciences perspective, providing practical guidance on how assessments can be thoughtfully designed and used in practice. A few times I questioned my decision to include this chapter in my final critical reflection, especially during the past two weeks, as the other four articles I selected focused more directly on teacher education and professional development (PD) within the learning sciences. For example, Hora et al. (2021) examined postsecondary faculty teaching practices, specifically oral communication skills, offering insights into how faculty development programs can enhance these competencies from a sociocultural perspective. Marshall & Horn (2025) investigated how teachers adapt and transform PD practices (“recontextualization”) into “personally meaningful practices” within their specific classroom contexts, emphasizing teacher agency and situated professional learning. Fishman et al. (2022) provided a comprehensive review of teacher learning and PD, highlighting various theoretical perspectives and mechanisms that support effective PD. Clark et al. (2024) proposed a framework for developing “designerly stances” in pre-service teachers, enabling them to adopt design-field perspectives for instructional design beyond traditional “backward design”. In contrast, Pellegrino (2022) focused mainly on assessment, which at first seemed quite different from the other four articles: it addresses how teachers can design and use effective assessments aligned with learning goals, while also showing learning scientists how to incorporate assessment into their research. On closer review, however, I chose to keep it in my reflection, recognizing its value in strengthening the assessment dimension of PD. Despite its distinct focus, it is highly relevant to my research on PD for university continuing education instructors, as it emphasizes the importance of aligning assessment with curriculum, pedagogy, and program context, areas where instructors often need targeted support.
Pellegrino’s (2022) chapter opened with a comprehensive exploration of educational assessment from a learning sciences perspective, aiming to guide its effective design and use. Pellegrino (2022) began by acknowledging that “[a]ssessment is one of the most controversial issues in education, with scholars and policy makers arguing about what we assess, how we assess, how often to assess, and the ways in which information derived from assessments is used to shape educational practice” (p. 238), which captures the confusion and inconsistency many instructors face when designing assessments, particularly in online or non-traditional learning environments. This is also relevant to my experience working with instructors, many of whom are subject matter experts but may lack formal training in assessment design. During our program renewal, especially in the asynchronous course development process, I often observed that instructors focused primarily on content creation, with limited clarity about what types of assessments to use, how to implement them, and how to ensure alignment with learning goals. Pellegrino’s chapter equips me, as an instructional designer, to unpack these complexities and to design PD opportunities that build instructors’ understanding of assessment and provide guidance on clarifying the purpose and use of different assessments to support student learning.
The chapter identified three primary functions of assessment: formative assessment, which supports day-to-day instruction and provides feedback to students; summative assessment, which evaluates individual achievement after a period of learning; and assessment for evaluating programs and institutions, judging their quality and effectiveness. A key insight Pellegrino (2022) offered is that “[n]o single type of assessment can serve all of the purposes and contexts reviewed above” (p. 240), emphasizing the risk of a one-size-fits-all approach. This insight is particularly useful for PD, where instructors often need support in selecting or designing assessments that align with specific course goals. It is especially important in online courses, where assessment design frequently shapes students’ engagement: assessments such as quizzes or discussion forums should be chosen with their pedagogical purpose in mind. For instructors in continuing education, Pellegrino’s framework reinforces the idea that assessments should not only evaluate learning outcomes but also support and guide the learning process over time, an area frequently overlooked in practice and well suited to targeted PD support. By helping instructors understand and apply these distinctions, PD can better support the development of thoughtful, aligned, and meaningful assessment strategies in online learning environments.
The chapter then introduced four conceptual frameworks central to assessment design: assessment as a process of reasoning from evidence, which is foundational and widely shared, often illustrated by the assessment triangle involving “a model of student cognition and learning,” “a set of assumptions and principles about the kinds of observations,” and “an interpretation process” (p. 241); assessment driven by models of learning, such as learning progressions, which are “empirically grounded and testable hypotheses about how students’ understanding…grow and become more sophisticated over time” (p. 244); the use of an evidence-centered design (ECD) process to develop and interpret assessments; and the centrality of validity, where “an assessment is said to be valid when it is in fact measuring what it claims to be measuring” (p. 246). Each of these frameworks offers valuable insights for the PD programs I plan to design for instructors. They provide clear, research-based tools and a shared language for designing assessments aligned with both cognitive development and disciplinary goals. This is particularly important in continuing education, where instructors are often experts in their fields but may lack formal training in assessment design. By integrating these principles into PD, I can help bridge the gap between theory and practice, supporting instructors in creating intentional, meaningful assessments that reflect how students learn and what they are expected to achieve.
Pellegrino (2022) also discussed implications for classroom assessment (recommending better integration with curriculum and instruction), for large-scale assessment (suggesting a focus on critical aspects of learning and broader competencies), and for models of measurement (psychometrics), highlighting their traditional limitations in formative assessment and the need for multidisciplinary collaboration. Most importantly, Pellegrino (2022) proposed a vision of balanced assessment systems that are “comprehensive, coherent, and continuous” (p. 252), meaning that they coordinate various forms of assessment with instructional design to support learning effectively. This vision aligns directly with the focus of my research, which emphasizes PD that helps instructors design well-aligned courses in which content, pedagogy, and assessment are intentionally integrated. Pellegrino’s framework reinforces the importance of rebalancing instructors’ pedagogical knowledge, especially for those teaching online, by equipping them to design assessments that not only measure student learning but also promote engagement and deeper understanding. In online learning environments, where meaningful interaction can be more challenging to sustain, thoughtful assessment design becomes essential. The chapter strengthens the argument that assessment should not be treated as an isolated component but as a central, pedagogically driven part of course and program design.
Pellegrino’s (2022) chapter is a valuable resource for my research because it expands instructors’ perspectives on assessment and provides a theory-informed foundation for designing PD that goes beyond technical training. It equips me, as an instructional designer, with conceptual tools to help instructors understand why and how assessment should be aligned with learning goals, and how thoughtful assessment design can enhance online student engagement, both of which are central to my research. Although the chapter does not focus on PD directly, its frameworks offer essential content knowledge that can be embedded in PD initiatives. By deepening instructors’ understanding of the purpose and design of assessment, I can support them in rebalancing their Technological Pedagogical Content Knowledge and developing more engaging and effective online learning experiences.
References
Clark, D. B., Scott, D., DiPasquale, J. P., & Becker, S. (2024). Reframing design in education: Proposing a framework to support pre-service teachers in adopting designerly stances. Journal of the Learning Sciences, 33(4–5), 613–666. https://doi.org/10.1080/10508406.2024.2397762
Fishman, B. J., Chan, C. K. K., & Davis, E. A. (2022). Advances in teacher learning research in the learning sciences. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (3rd ed., pp. 619–637). Cambridge University Press. https://doi.org/10.1017/9781108888295.038
Hora, M. T., Benbow, R. J., & Lee, C. (2021). A sociocultural approach to communication instruction: How insights from communication teaching practices can inform faculty development programs. Journal of the Learning Sciences, 30(4–5), 747–796. https://doi.org/10.1080/10508406.2021.1936533
Marshall, S. A., & Horn, I. S. (2025). Teachers as agentic synthesizers: Recontextualizing personally meaningful practices from professional development. Journal of the Learning Sciences, 1–39. https://doi.org/10.1080/10508406.2025.2468230
Pellegrino, J. W. (2022). A learning sciences perspective on the design and use of assessment in education. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (3rd ed., pp. 238–258). Cambridge University Press. https://doi.org/10.1017/9781108888295.015