Monday, December 7, 2009

Engaging Teachers with a Streamlined Version of Understanding by Design


Excitement filled the room as curriculum specialists and web designers from four universities designed the coolest unit design template. There was no shortage of ideas with valid arguments about what should be included in the final design.

Later, the excitement was quite different when I introduced this template to my first group of teachers. As the template was revealed, I could see the teachers deflate in their chairs. I'll never forget the hope in their faces turning into something more lifeless. But bless their hearts, willing to comply, they were soon going from box to box, and dropdown menu to dropdown menu, filling in the blanks like robots. I vowed never again to visit this day on another teacher. That was nine years ago.

What happened? The design model was sound, the research-based strategies were sound, and the template was way cool. The problem was that it became more about the template than their classroom.

I contrasted this day with another day when a room of teachers was filled with laughter and creative passion; new ideas seemed endless, and the energy was off the charts. The creativity was raw, innocent, intimate, and personal. That day, we were scanning and editing family photos, and shooting and editing video.

Because we had discovered fast and easy ways to teach photo and video editing, it was easy for teachers to focus on their content. The process was so streamlined that the day was less about the program and more about what mattered to the teachers that day - pictures of their families and stories they wanted to tell.

I decided I had to streamline the lesson design process and somehow get the creative ink flowing. I had to make it less about the improvement model, less about the template, less about the research, and more about what mattered to teachers - teaching and students. Somehow I had to get teachers to stand willing and open-minded in front of the canvas of instructional design. This streamlining was the most important thing I've done to engage teachers and help them find meaning, purpose, and motivation in the improvement process as they make it their own.

To do this, I began by identifying the overall goal in the simplest terms: weave research-based teaching into the alignment of standards, curriculum, instruction, and assessment. Keep in mind that a long evolution of strategies and materials has made this process what it is today, but I'll outline it here. By the way, you'll see I'm a big fan of backwards design.

The basic process is this:

1. Have teachers select very specific objectives from which to begin - ideally, performance objectives that represent where students are struggling the most.

2. Brainstorm a body of evidence. Challenge the existing practice and focus on written evidence because this leverages many research-based strategies.

3. Sequence the evidence into the beginning, middle, and end of a lesson. Every day has a beginning, middle, and end full of research-based opportunities.

That's it. We quickly begin a process of improvement by linking and aligning their objectives to evidence of student understanding. Teachers soon believe this is about their teaching, their students, and their classroom.

The beauty about this focus on evidence is that it begins by meeting the teachers where they are (i.e., the evidence they already have). However, we then expand and challenge their body of evidence, which in reality is expanding how they might teach differently while clarifying and focusing instruction. We focus on their content and focus their content.

We emphasize written evidence because of its ability to facilitate research-based strategies such as pre-assessing, activating prior knowledge, similarities and differences, advanced organizers, note taking and summarizing, providing feedback, reading, and writing. Then, following instruction and classroom use, written evidence in the form of student work samples provides a means for analyzing results and measuring success.

Even when imperfect at first, the hands-on experience allows teachers to begin to see and feel an alignment of standards, curriculum, instruction, and assessment. They begin to share a common language. Objectives, content, and instruction become more focused. Data finds meaning in the cycle as a tool for teachers to target the curriculum renewal that will have the most impact on achievement. Teachers begin to feel predictability and control for doing something new.

Streamlining instructional design can ignite confidence, raise expectations for the achievable, and breathe life into small, intimate, meaningful cycles of improvement. Quick, practical cycles of designing, teaching, analyzing results, and making revisions, provide purpose for collaboration and collective learning.

When a specific curriculum, one that defines actual moments of instructional time, is open to renewal, change is more likely. By working beyond lesson plans and creating materials of practice, the best of what we already know and the best new ideas can be evaluated, tested, and fitted for their proper place in a journey to increased student achievement. Conversely, without these small cycles of change, without making materials ready to teach, new ideas are quickly overwhelmed by a relentless stream of other ideas, soon fading beyond the reach of new classroom practice.

Streamlining is hard because we know so much about what works, and there will be valid arguments to do many things. It is too easy to underestimate the personal implications of change and overestimate the amount of content that can be successfully assimilated into classroom practice.

Finding Leverage Points and Measuring Classroom Success

I've worked with hundreds of teachers from many districts. For too long, local and state administrators, my training staff, and I were overly impressed with the progress teachers were making at our institutes. Don't get me wrong, there was good reason to be impressed. Walking around the room, you had to admire the professional dialog and collaboration while teachers created new lesson plans and materials. Recognizing and celebrating this great work was not the problem; the problem was failing to focus on what happened in classrooms after the institutes.

During staff development, as we increased the connection between research-based strategies and instructional design, we increased our ability to predict positive outcomes in classrooms. But we could only predict classroom success at the institute. Those of us focused on measuring success in classrooms knew too little was being transferred from the institutes to actual teaching practice.

During my classroom follow-up, again and again I would find lessons used poorly, if used at all. I could attribute this failure to a lack of support at the school, and I knew this lack of support was due in part to the design of my professional development. I had failed to define for administrators what successful change looked like. I had failed to clearly define how administrators could and should support new teaching.

Instead, during the last day of the institutes we celebrated presentations of good lesson plans. The presentations were quite good and full of good intentions. Similarly, during the school year when lesson plans were due, school administrators celebrated the vastly improved lessons. However, this focus on lesson plans created a false hope about what was actually happening in classrooms. We assumed too much. It was convenient to assume better lesson plans led to better teaching. But this was true in only a small percentage of classrooms.

To have more of an impact on learning, we had to change our professional development design. To be more certain of the impact of PD on classroom teaching, we needed to focus beyond the lesson plans by measuring and celebrating better teaching.

This idea wasn't new. We've known this to be true through research. I did believe the research that suggested I needed to define measurable classroom outcomes, but I hesitated. We use an evidence-driven design process, and I hesitated to define what evidence looked like in an individual classroom because evidence varied for each lesson. How could we define measurable change with so many variables?

However, after another year of more impressive lessons followed by more unimpressive classroom results, I decided I somehow had to give definition to classroom success - I had to define better teaching as a result of professional development. This was a turning point.

Once I decided I had to define measurable classroom-level criteria for change, answers came rather quickly because of revisions to professional development that had already begun.

We were struggling to make better connections between instructional planning and research-based strategies. To help close this gap, we began doing panel reviews, peer reviews, and more, in which teachers had to describe in detail and show us that they were prepared to, for example, activate prior knowledge, teach similarities and differences, or increase reading comprehension.

Quickly, with the addition of these reviews, the importance of a particular type of evidence became clear. As teacher after teacher was challenged to demonstrate how they would facilitate research-based teaching, reading, and writing strategies, written responses emerged as an important leverage point for better teaching.

To further understand the impact of written responses, it is helpful to look at one way evidence is classified by Wiggins and McTighe:

1. Final products (e.g., projects, models, exhibits)
2. Quizzes and tests
3. Public performances (e.g., presentations, role play)
4. Oral responses (e.g., questioning, interviews)
5. Observations (e.g., using observation checklists)
6. Written responses (e.g., organizers, notes, summaries, papers, reflections)

Of the six classifications of evidence above, written responses provide teachers the most opportunity to facilitate research-based teaching and literacy strategies as students interact with the materials in their own handwriting. Conversely, quizzes, tests, and other forms of evidence relate more to simple checks for understanding than they relate to research-based teaching and literacy integration. Before I illustrate this in more detail, it is important to note that we combined the emphasis on written responses with an emphasis on certain text structures.

More than any other text structure, high school students struggle consistently with expository text patterns. Yet these information-based content patterns are at the heart of so much of the new content students are exposed to in our world today. In this instructional design course, we use expository text structures as a leverage point for academic integration and improvement; the patterns become the foundation from which we create opportunities to clarify and focus content, and to teach content more effectively using research-based teaching, reading, and writing strategies.

Note that expository patterns include Description, Enumeration, Time/Sequence, Compare/Contrast, Cause/Effect, and Problem/Solution. With these text structures in mind, a well-crafted “sequence” organizer, for example, can be used to teach steps for preparing a recipe, a science experiment, a timeline in history, a procedure in the medical field, a procedure for solving a math problem, repairing a computer, or a sequence of events for framing a wall in the construction class.

Now, more thoroughly than before, teachers invent custom materials throughout the design process that become classroom evidence (i.e., student work samples) of research-based strategies such as Similarities & Differences, Advanced Organizers, Note taking & Summarizing, Pre-Assessing, Activating Prior Knowledge, and Reading and Writing strategies. As teachers alternate between independent and collaborative work, they produce practical tools that leverage multiple research-based strategies. This level of design makes change personal while creating a powerful sense of predictability and control for doing something new. Moreover, these materials create opportunities for teachers to provide feedback, and students generate an archive of learning to build upon, study, and revise.

Lying within this leverage point was an answer for measuring success. We have since refined tools and methods for analyzing student work samples. Not only do these samples provide evidence of classroom change, they also provide a means of collaborative learning where teacher change happens best - in and around classrooms.

By designing professional development for classroom implementation, administrators can now look beyond curriculum maps and lesson plans to the materials used to teach. If professional development has planned for it, administrators can ask their teachers for evidence of research-based teaching, and they can ask for evidence of supporting reading and writing in every classroom. Asking for evidence of improved classroom teaching, facilitating an analysis of work samples, and sharing and celebrating results are practical ways of offering critical system support and continuing the improvement cycle.