Monday, December 7, 2009

Finding Leverage Points and Measuring Classroom Success

I've worked with hundreds of teachers from many districts. For too long local and state administrators, my training staff, and I were overly impressed with the progress teachers were making at our institutes. Don't get me wrong, there was good reason to be impressed. Walking around the room, you had to admire the professional dialog and collaboration while teachers created new lesson plans and materials. The problem was not in recognizing and celebrating this great work; the problem was in failing to focus on what happened in classrooms after the institutes.

During staff development, as we increased the connection between research-based strategies and instructional design, we increased our ability to predict positive outcomes in classrooms. But we could only predict classroom success at the institute. Those of us focused on measuring success in classrooms knew too little was being transferred from the institutes to actual teaching practice.

During my classroom follow-up, again and again I would find lessons used poorly, if used at all. I could attribute this failure to a lack of support at the school. And I knew this lack of support was due in part to the design of my professional development. I failed to define for administrators what successful change looked like. I failed to clearly define how administrators could and should support new teaching.

Instead, during the last day of the institutes we celebrated presentations of good lesson plans. The presentations were polished and full of good intentions. Similarly, during the school year when lesson plans were due, school administrators celebrated the vastly improved lessons. However, this focus on lesson plans created a false hope about what was actually happening in classrooms. We assumed too much. It was convenient to assume that better lesson plans led to better teaching. But this was true in only a small percentage of classrooms.

To have more of an impact on learning, we had to change our professional development design. To be more certain of the impact of PD on classroom teaching, we needed to focus beyond the lesson plans by measuring and celebrating better teaching.

This idea wasn't new. We've known this to be true from the research. I believed the research suggesting I needed to define measurable classroom outcomes, but I hesitated. We use an evidence-driven design process, and I was reluctant to define what evidence looked like in an individual classroom because evidence varied for each lesson. How could we define measurable change with so many variables?

However, after another year of more impressive lessons followed by more unimpressive classroom results, I decided I somehow had to give definition to classroom success - I had to define better teaching as a result of professional development. This was a turning point.

Once I decided I had to define measurable classroom-level criteria for change, answers came rather quickly because of revisions to professional development that had already begun.

We were struggling to make better connections between instructional planning and research-based strategies. To help close this gap, we began doing panel reviews, peer reviews, and more, in which teachers had to describe in detail and show us that they were prepared to, for example, activate prior knowledge, teach similarities and differences, or increase reading comprehension.

Quickly, with the addition of these reviews, the importance of a particular type of evidence became clear. As teacher after teacher was challenged to demonstrate how they would facilitate research-based teaching, reading, and writing strategies, written responses emerged as an important leverage point for better teaching.

To further understand the impact of written responses, it is helpful to look at one way evidence is classified by Wiggins and McTighe:

1. Final products (e.g., projects, models, exhibits)
2. Quizzes and tests
3. Public performances (e.g., presentations, role play)
4. Oral responses (e.g., questioning, interviews)
5. Observations (e.g., using an observation checklist)
6. Written responses (e.g., organizers, notes, summaries, papers, reflections)

Of the six classifications of evidence above, written responses provide teachers the most opportunity to facilitate research-based teaching and literacy strategies, because students interact with the materials in their own handwriting. Conversely, quizzes, tests, and other forms of evidence relate more to simple checks for understanding than to research-based teaching and literacy integration. Before I illustrate this in more detail, it is important to note that we combined the emphasis on written responses with an emphasis on certain text structures.

High school students struggle more consistently with expository text patterns than with any other text structure. Yet these informational patterns are at the heart of so much of the new content students are exposed to in our world today. By using expository text structures as a leverage point for academic integration and improvement in this instructional design course, we make the patterns the foundation from which we create opportunities to clarify and focus content, and to teach content more effectively using research-based teaching, reading, and writing strategies.

Note that expository patterns include Description, Enumeration, Time/Sequence, Compare/Contrast, Cause/Effect, and Problem/Solution. With these text structures in mind, a well-crafted “sequence” organizer, for example, can be used to teach the steps for preparing a recipe, conducting a science experiment, building a timeline in history, following a procedure in the medical field, solving a math problem, repairing a computer, or framing a wall in a construction class.

Now, more thoroughly throughout the design process, teachers invent custom materials that become classroom evidence (i.e., student work samples) of research-based strategies such as Similarities & Differences, Advance Organizers, Note Taking & Summarizing, Pre-Assessing, Activating Prior Knowledge, and Reading and Writing strategies. As teachers alternate between independent and collaborative work, they produce practical tools that leverage multiple research-based strategies. This level of design makes change personal while creating a powerful sense of predictability and control for doing something new. Moreover, these materials create opportunities for teachers to provide feedback, and students generate an archive of learning to build upon, study, and revise.

Lying within this leverage point was an answer for measuring success. We have since refined tools and methods for analyzing student work samples. Not only do these samples provide evidence of classroom change, they also provide a means of collaborative learning where teacher change happens best: in and around classrooms.

By designing professional development for classroom implementation, administrators can now look beyond curriculum maps and lesson plans to the materials used to teach. If professional development has planned for it, administrators can ask their teachers for evidence of research-based teaching, and they can ask for evidence of supporting reading and writing in every classroom. Asking for evidence of improved classroom teaching, facilitating an analysis of work samples, and sharing and celebrating results are practical ways of offering critical system support and continuing the improvement cycle.