We spend countless hours and large sums of money on professional development, aiming to improve learner outcomes and drive profound change; however, we often overlook the critical need to bring the same dedication to continuously evaluating its effectiveness.
One study estimated that in the 2007–08 school year, the Philadelphia School District spent almost $162 million on professional learning, including training for teachers as well as release time for teachers and coaches (ERS, 2013).
– REL Northeast & Islands
This is just one data set among many. Authentic, ongoing evaluation is crucial not only to ensure that money is meaningfully spent but, most importantly, to determine whether our students and educators are being provided with enduring, effective, and excellent learning experiences. Ongoing evaluation also identifies areas that need adjustment and refinement and determines whether professional development is meaningfully aligned with goals.
- So, how do we engage in evaluation that is ongoing and meaningful, and filled with more walk-the-talk than talk-the-talk actions and data?
- How do we know our strategies, models, coaching techniques and cycles, workshops, courses, guides, articles, and more are resulting in enduring, meaningful, innovative and off-the-charts changes and learning?
Gathering and analyzing qualitative data provides answers to these and similar questions. This is not to say that we should ignore quantitative data, but too often it is the only data set used to determine effectiveness. Test scores and numbers alone do not authentically gauge students’ and educators’ depth of knowledge, creativity, innovation, intrinsic motivation, and attitudes. Therefore, to evaluate effectiveness, we should examine 80% qualitative data and 20% quantitative data.
This SAGE Evaluating Professional Development framework explores four focus areas for evaluation: student learning outcomes, application of participants’ knowledge, gateway elements of organizational change, and enduring empowerment and excellence. It is guided by research and synthesizes and blends ideas and questions from the work of Rick DuFour, Thomas Guskey, and the Kirkpatrick evaluation model, which focuses on evaluating reactions, learning, behavior, and results.
Student Learning Outcomes
In this framework, Student Learning Outcomes comes first because the primary focus of PD evaluation should be on the most important people in education: our students. Applying the 80-20 data analysis rule, we must examine student learning products and portfolios in conjunction with test results. A meaningful evaluation practice is to have students gather, analyze, highlight, and present their own data to “defend their products and techniques” based on the new PD strategies implemented. Let’s not wait for them to reach a Ph.D. level in education to do this.
Evaluation Questions
- Am I a more confident, motivated, dedicated, autonomous, and passionate learner because of this change?
- Did my intrinsic motivation increase because of this new strategy? Did this strategy address my learning styles and interests?
- Did this change improve my creativity, depth of knowledge, and learning products?
- What new and revised learning products and outcomes were generated because of this change?
- Did I grow because of this change? How do I know?
- Did this change make learning more applicable to real-world situations and my communities?
- Did this change help me develop future workplace skills of innovation, resilience, and adaptability?
Data Collection Methods
Student Portfolios
Interviews
Questionnaires
Self-reflection
Self-evaluation
Community-Based Learning
Project Based Learning
Student Presentation
Assessment Results
Application of Participants’ Knowledge
Knowledge is not power until it’s applied.
— Dale Carnegie, American writer and lecturer
Along with assessing participants’ reactions and knowledge, how do we authentically evaluate participants’ implementation of the ideas and strategies gained from PD over time? According to CALPRO, “It is unreasonable to expect that individual professional development activities will immediately result in altered long-term instructional behavior, improved learner performance, or changed organizational structures and practices.” Therefore, focus must be placed on using various measurement techniques at several measurement intervals. Evaluating the application of participants’ knowledge is only meaningful when we see how the strategies gained powerfully change students’ and educators’ learning trajectories over time. Educators must also be allowed to build portfolios and to collect, analyze, and share evaluative data.
Evaluation Questions
- Is the information gained useful and does it meet my learning goals, adult learning styles, and my students’ needs?
- How will I use and implement ideas gained and generated to achieve organizational and personal learning goals?
- How will I know if my ideated products are effectively applied and authentically implemented?
- How will I know if my learners and I are functioning at a high level of excellence?
- What do I want students/learners to show and do because of this change?
- How have student achievement and motivation changed? How do you know?
- Was there a genuine focus on aligned, authentic, and relevant learning?
Data Collection Methods
Teacher Portfolios
Peer-to-Peer Interviews
Peer-to-Peer Learning Walk Surveys
Questionnaires
Self-reflection
Gateway Elements of Organizational Change
We often wonder why strategies gained from professional development are not meaningfully and consistently implemented by participants to impact excellence in teaching and learning. Too often, leaders, colleagues, and organizations micromanage and suppress the implementation of those strategies. It is therefore important to measure and evaluate affective domains, the culture of learning and excellence, alignment to organizational and personal goals, autonomy to transfer and implement knowledge, equitable celebrations, and more to determine what support is needed, and what support is being provided, to leverage organizational change.
Evaluation Questions
- Is a culture of learning developed and maintained? What evidence other than words on a website or on a document reflects this?
- Were sufficient resources made available to support implementation?
- Were successes equitably recognized and shared?
- Were changes at the individual level encouraged and supported?
- How do you know if authentic and meaningful change occurred?
- Was implementation meaningful? How did you allow the teachers/educators to have autonomy in attaining excellence?
Data Collection Methods
Learning culture records
Learning Support Summaries
Participants’ Portfolios
Newsletters
Questionnaires
Support Surveys
Equitable Success Recognition
Assessment Scores
Enduring Empowerment and Excellence
When we provide professional development, the ultimate goal is to empower participants with an infectious passion for ceaselessly attaining, researching, blending, and authentically implementing knowledge at a high level of impact and excellence. Enduring empowerment and excellence might seem difficult to evaluate, but it manifests itself plainly in educators’ and learners’ portfolios of ideated products and meaningful applications. To ensure that we are developing lifelong, autonomous, and self-reflective learners, we need to focus heavily on evaluating qualitative data from showcased achievements, shared testimonials, meaningful success stories, newsletters, videos, blogs, infographics, and other artifacts.
Evaluation Questions
- How is autonomous, self-reflective, and continuous learning encouraged and supported for students and adults?
- How are educators encouraged and empowered to blend strategies gained from various PDs to impact learning?
- How do you provide educators with choice in generating evidence of learning and growth?
- How are learners using knowledge gained and products created in their lives and workplaces?
Data Collection Methods
Showcased Achievements
Shared Testimonials
Success Stories
Video Library
Blogs and Infographics
Newsletters
Websites
Paper Portfolios
Innovative Artifacts Library
We must engage in meaningful and ongoing evaluation to make sure the money and time spent on professional development translate into the profound impact we intend: developing innovative, adaptive, and lifelong learners who can make unique contributions to the world and the workplace. If we fail to evaluate professional development, we are just engaging in busy work.
To hear more from Cherry-Anne, connect with her on social media (@Chemelnalis)
References
Breslow, N., & Bock, G. (2020). Evaluating Professional Learning: A Tool for Schools and Districts. REL Northeast & Islands. https://ies.ed.gov/ncee/edlabs/regions/northeast/pdf/ne_5.3.5_evaluation_pd_brief_12-22-20_accessible.pdf
CALPRO. Evaluation of Professional Development. https://www.calpro-online.org/pubs/Eval4.PDF
Educational Innovation 360. (2022). How Can Teachers Measure Their Professional Growth? https://www.educationalinnovation360.com/blogs/how-can-teachers-measure-their-professional-growth
Guskey, T. R. (2002, March). Does It Make a Difference? Evaluating Professional Development.
Educational Leadership, 59(6), 45-51. Retrieved from https://tguskey.com/wp-content/uploads/Professional-Learning-4-Evaluating-Professional-Development.pdf
Johanson, T. Vision Our Impact: What Change Are We Hoping For? https://johansonconsulting.ca/category/leadership/professional-development/
Learning Forward. (2014). Evaluating Professional Learning: Measuring Educator and Student Outcomes. The Professional Learning Association. https://learningforward.org/newsletters/transform/march2014/evaluating-professional-learning-outcomes/