Wednesday, December 16, 2015

"To the International Baccalaureate ... And Beyond!"

I'm in a meeting, talking with veteran independent school teacher, Art L., and he is getting what we would call in Memphis, "all kinds of fired up." We are discussing the International Baccalaureate's (IB) rubric for teachers in the IB English Year 1 course, the rubric all IB English teachers must use to assess students' "Individual Oral Presentations" (IOPs). IB courses in English Literature are among the most advanced Lit courses offered at most IB schools and the IOP is one of those essential summative assessments in the rigorous IB English course. Art is less than impressed by the IB-developed rubric.

"Nowhere in this rubric are standards or criteria for students making a cohesive, logical argument," he laments. "It's possible for a student to simply be familiar with the text he's analyzing, to make good eye contact throughout the presentation, and to use specific terminology, which could have been blindly memorized; he could earn a perfect score by just doing those three things!"

Art is correct and his antipathy seems well justified. With only three standards to be assessed, the IOP rubric is overly simplistic for a summative assessment. The achievement bar, which we would expect to be set fairly high for an IB course, appears to be set at a baseline level in this case.

Art's eyes twinkle as we begin considering tweaking the IB-sanctioned rubric to increase the level of expectation and achievement. We are now talking about the sacred realm beyond the almighty IB, and we are both all kinds of fired up.

For my readers who don't know, the IB is a curriculum originally designed in the late 1960s for international schools, schools with very transient student and faculty populations. The idea behind the early IB was to create a static set of rigorous courses, all with a set framework of peer-moderated assessments, so that a student transferring from one international school to another could conceivably continue his or her studies. The framework and peer moderation engendered a consistent set of courses that could be taught at any school. The "IB Diploma Programme," a two-year program of study designed for 16- to 19-year-olds, also included a component for community service, activity within the school, and an emphasis on the pursuit of creative endeavors. Additionally, the Programme required participation in a type of philosophy course and also required all IB students to write an original research paper. The idea worked, and many international schools "adopted the IB program." Students who graduated with their "IB Diploma" reported being very well prepared for university. That success made transient parents of IB students, already happy that their children could continue their studies while moving from school to school, ecstatic. Because of its success and its relatively high level of academic rigor, the IB has become a kind of gold standard among internationally minded schools worldwide. Today, thousands of schools offer the IB, which has also expanded to offer separate programs for middle schools and even elementary schools. In my opinion, the IB is a strong, rigorous, and potentially very rewarding program.

But as Art and I are talking on this warm, sunny December day, we are agreeing that the "programme" ain't perfect.

For starters, IB teachers can sometimes become slaves to the structures of IB examinations and the banks of past IB examinations and papers. Over the years that I have been a part of teaching the IB, I have seen this tendency to "teach to the test" grow quite strong. In defense of the IB program, I have to admit that the structures of the various IB written assessments tend to be academically beefy, but as Art and I are finding, there are some exceptions in every IB discipline, the IOP discussed above being just one example. Teaching to the test can be a powerful experience if the end assessment is an excellent evaluative tool. If it isn't ... well ...

As IB teachers tend more towards teaching to the test, IB students become more test-obsessed. In my own IB classes, the question "Will this be on the test?" became an all too common refrain. Many times, I answered, "Yes! This content is from such-and-such location in the curriculum and syllabus guide." But sometimes I answered, "No, but this content will help you to better understand such-and-such topic that could be on the exam. Trust me." Despite my pleas, students tended to pay attention in the former case and to doze off in the latter case.

Each IB curriculum goes through a curriculum review cycle every few years. The cycle is a proper evaluation in that during a given review cycle, each IB program is assessed for needs, design, content, implementation, and outcomes as measured against yearly examination results and feedback from current teachers. The idea of conducting a regular review cycle is fantastic, making for a dynamic curriculum. But sometimes programs that "ain't broke" get "fixed" nonetheless. Consider that in the past few years, the IB Design Technology program has de-emphasized providing students with opportunities to actually create stuff. The IB Theater program has come to de-emphasize providing students with opportunities to actually perform stuff. And as Art's experience above illustrates, the IB English program may be de-emphasizing providing students with the opportunity to make logical sense of stuff.

During my later years as an IB Economics teacher, I tried to solve some of these problems by developing instructional units tightly based on the IB syllabus but, in some cases, going beyond it. I would not add additional units to those suggested by the IB Economics syllabus, but I would augment each unit with additional lessons, some designed for struggling students, some designed for accelerated students. As an example, the IB Economics syllabus does not ask students to derive a demand curve. Such a derivation was a part of my course, and I found that with my weaker students, deriving a demand curve helped them to better conceptualize what 'demand' really is. For more able students, my course offered the opportunity to delve into 'supply' and 'sustainability.' The IB syllabus does not specify that teachers prompt students to consider the linkages between these two concepts, but I thought it both relevant and important to spend some time considering the limits of such linkages. Given my students' IB exam averages, they were certainly not hurt by my course's time allocations.

I wonder how many IB teachers make similar adjustments. Based on my current conversation with Art, I know another teacher who is heading down that path.

Later that day, a second-year IB student pops his head into my office and tentatively asks me if I can answer an Econ question for him. I am an administrator and have not taught Econ in a few years, but many of my school's students have found my YouTube channel, one devoted to helping folks better understand the intricacies of both the IB and the AP Economics syllabi. I tell the student to come on in. His question is about the role of the central bank in the economy (something specified on his IB Economics syllabus). He wants to know about the central bank and interest rates. I ask him if he has studied the money market and the loanable funds market (two topics not specified on the IB Economics syllabus but that were a part of my old Econ course). He responds negatively. We spend the next fifteen minutes discussing both markets and how they relate to central banks and to almost every other bank in most economies. He is a good student, and he has studied other markets to the extent that he is able to catch on to the mechanics of these two markets. I see the light bulb go on as he is now able to easily answer his initial question. He feels great, leaving my office with a greater degree of confidence.

Looking back, he smiles and asks, "Why aren't these markets on our syllabus?"

Good question.

Thanks for reading. -Kyle

Sunday, November 29, 2015

A Different Way of Thinking about High School Final Exams

I am perusing a high school's final exam schedule, the one I created months ago. In my current post as High School Assistant Principal, it is my job to engineer the final exam schedule, checking and double-checking for accuracy and then posting the result to anxiously waiting students and their families via a shared Google doc. This process is accomplished several months in advance of the actual exams, so the December exam schedule is done and dusted (and posted) in October.

To put things in perspective for you, exams at my school stretch out over a four-day period. On each day, two multiple-hour exams are given: one session lasting from 9-11 AM and the next session held from 1-3 PM. During "exam week," the regular school schedule is suspended; only exams take place during the week. Once a student is finished with an exam on a particular day, he or she can return home. Woohoo.

It's my job to put the schedule together, and it is a tedious affair. Courses, sections, section sizes, physical spaces, rooms, desks and chairs, and proctor availability all must be taken into consideration. Each course is given a specific slot in the overall exam schedule, each course being further divided into several sections of different groups of students. Scheduling all sections of a course to have an exam on a particular day and time saves the teacher from having to create several different versions of his or her exam, but it also unfortunately makes the creation of a separate examination schedule a complex necessity. The result is a sizable matrix, a jumble of teacher names, room numbers, class sizes and proctor names.

As I am perusing the schedule for the 90-millionth time, quietly fretting about individual student conflicts and possible errors, I am becoming despondent. In just a few days, several hundred of our students will sit down for a relatively stress-filled, two-hour period to demonstrate what they know, what they have learned. At my school and hundreds of other secondary schools, I imagine that for the most part, students are sitting down to a teacher-made booklet of questions: some multiple-choice, some fill-in, and some short-answer. Some exams will include a short essay component. A handful of exams will be entirely essay based. With expectations that exam results are to be posted before students and teachers depart for winter break, the schedule is probably tight and does not leave room for much other than an SAT-style exam. So students demonstrate their learning in a two-hour session on a piece of paper mostly composed of objective questions. I am thinking that there must be a better way.

There is.

First, let's dispense with the traditional end-of-semester timetable for examinations. Like the structure of the rest of the academic year, the end-of-term exam schedule is a throwback to the late 1800s. That calendar structure was itself created by social reformers trying to find a compromise between the agrarian calendar that was shaping most people's lives a hundred years ago and the emerging urban work calendar that was shaping the lives of a growing number of city denizens. With the vast majority of today's students not needing to return home to help out on the farm, surely it is time we revisit the notion that exams must be given before plantings and harvests.

I'd like to suggest another reason for throwing out the end-of-term examination schedule: the nature of learning itself. By creating an exam schedule divided into two-hour blocks, we are essentially telling students that they damn well better demonstrate their learning in this two-hour session; otherwise, we will assume that learning has not really taken place (and we will fail them). Current exam schedules favor students with faster processing speeds and stronger memories. Are these "quick" students smarter than slower students with weaker memories? Not necessarily. But by continuing to pursue an end-of-term exam schedule, we stack the exam deck against certain kinds of students and thus garner skewed results about how much learning our students have accomplished. I bet we underestimate learning and damage certain students' self-esteem at the same time. Bravo, us.

Secondly, let's start (or continue) having widespread discussions about what constitutes a quintessential summative assessment. (For my non-teacher readers, a summative assessment is typically a large-scale assessment given at the end of a unit, project, course, year, etc. to determine learning and achievement.) How many of us would say, "Yes! A quintessential summative assessment for my discipline is a multiple-choice test"? Anyone out there want to own up to proudly proclaiming something like this? God, I hope not. And yet that is exactly the unspoken claim many of us silently make when we create end-of-term assessments made up of mostly objective questions. "Here, take this 100-question multiple-choice test; it's the best instrument I've got to determine achievement. Show me what you have learned." Excrement; pure excrement.

When we are having these strategic discussions about quality summative assessments, I would like for these discussions to be truly widespread among the teachers at my school. And while I am wishing, I would also like for each teacher to really think about this and to come up with examples and answers. Here! I'll start the ball rolling. I taught high school Economics for years. If I think about it, the quintessential assessment for a student in one of my courses should probably be some kind of written economic analysis, full of graphs, data, and reasoned arguments from multiple perspectives. Before delivering such a summative assessment, I would need to scaffold some of the skills required for students to be able to successfully complete such a task. I would have to first define what I meant by an economic analysis, and I would then have to equip students with some of the tools used to deliver such an analysis. I would have to show my students an exemplary specimen of such an analysis, explaining to my students, probably via a rubric, why the specimen is exemplary. Maybe students would need to see multiple examples! Then students would need to practice and get feedback. And then practice again, and maybe again. Then they would be ready, although some may need more time (back to my original point about the exam schedule). In the end I'd have a tool for determining achievement that does me and my subject area proud. Incidentally, I would also be setting the bar for student achievement and perseverance pretty high. If I know my students, almost all would try to live up to those high standards of achievement and perseverance.

What would constitute quintessential assessments in other disciplines? I am not a scientist, but in a science class, I should think that some sort of hypothesis testing, lab, or lab report would be a strong candidate. How about in a literature or English class? Again, not my professional bailiwick, but some form of draft-and-revision writing should be considered. In a second-language class? Perhaps an assessed oral conversation coupled with a written piece. Design Technology? A finished product maybe, or a detailed, written production analysis. Again, these disciplines are not my field; I am simply brainstorming, getting the ball rolling. These discussions and decisions are ultimately up to the professionals in the classroom. The result, however, is the same as I shared above: a summative assessment tool or model that allows most if not all students the latitude and creative room to demonstrate their level of achievement. And the tool or model used is a proud reflection of our own professional interests and passions.

If we adopted such a model, then our "exam calendar," if you will, would actually be the school year itself, not some artificial, anachronistic, century-old carry-over. If adopted, such a model would help us teachers promote the idea of summative testing as "knowing and doing" instead of simply knowing within a specified time frame that may or may not be realistic for every student. If adopted, such a model would, just maybe, leave students and teachers feeling less like they were jumping through hoops, especially towards the end of a term.

Now I know what some of you might be thinking. You are thinking that in the real world, the SAT is king, and if we don't prepare students for something like an SAT, then we are not preparing them for college and/or life later on. Let me reject that line of thinking with a question: when in your professional life was the last time that you took a multiple-choice test as a part of your job?

Thanks for reading. -Kyle