There Will Be No Beanstalk

What will it be next year? Which book or program will capture the imagination of America’s school administrators? Which teacher turned thought leader will have her fortunes changed overnight? Which consultant, too opportunistic and cowardly to remain in the arena and teach actual students, will be charging thousands of dollars to tell teachers how to do their jobs? Which business concept will weasel its way into America’s schools? What new elixir will I be forced to choke down, as impotent to resist as a baby whose mother airplanes a spoonful of unappetizing gruel toward his pinched mouth?

I do not know, but experience suggests it will be something. Likely, it will be something I’ve sampled before, under new management and packaged in a more attractive box. Something tasted by teachers who, after masticating for a while and maybe even swallowing, eventually spit it back up, only to chase it with something equally specious and unfulfilling.

We teachers are willing converts, regardless of how many times we’ve enthusiastically purchased the snake oil in the past. Sent off to a conference on the latest educational wonder drug, we quickly trade our initial skepticism for reluctant acceptance in some and acolytic zeal in others. Our principals stand in front of us with a tenuous grasp of the panacea they offer and virtually no understanding of the underlying science, but they assure us that it’s “research-based.” They point to a district where it supposedly worked, neglecting to mention that said district bears no resemblance to our own.

Still, we nod our heads. We sit in staff meetings where we are told that this, yes this, is our salvation! The magic bullet that will finally, finally raise those test scores, send more kids off to college, and make our schools the place everyone wants to be. Stick a Ph.D. on the end of a name and watch us assent under the assumption that someone smarter than us has the answer.

The remaining skeptics among us won’t dare say anything for fear of being labeled negative, or difficult, or not a team player, or not in it for the kids. No reason to place a target on our backs, not when we’ve been here before and know that this too shall pass.

And maybe in the back of our minds we think — having been told in so many ways over so many years that we’ve never measured up, never given these kids what they deserve — why not? Why not try this new thing? After all, what we’ve been doing hasn’t exactly been setting the world on fire.

Teachers, I think, often feel like Jack’s mother in the fairytale Jack and the Beanstalk. At our wit’s end, on the verge of giving up, and as a last-ditch effort, we decide to trade in the family cow. We’ve barely been getting by as it is. Nothing is working and it never will. Desperate, we hope for deliverance. After all, anything is better than a useless cow.

And wouldn’t you know it? There’s a peddler offering just the thing. Magic beans! The answer to all our troubles! Consultants, books, new programs, repackaged ideas, all sold by slick traffickers who, unlike us, were savvy enough to make a living in education outside of the classroom.

But teaching isn’t a fairy tale and there will be no beanstalk that teachers will climb to heretofore unattained heights. There is no magic. No riches. No geese who lay golden eggs. No magic harp. Not even an enraged giant or his concerned wife. They may be different sizes and colors than the beans we’ve planted before, but they’re still just beans.

Still, there will be hope. The newly acquired beans planted, we’ll look out the window, expecting that any day now we’ll wake up and see a beanstalk. We’re sure of it.

This is the curse of being a teacher. We will forever be hoping the beans will sprout. No matter how many times they fail to germinate, we will always trade away the cow in the hope of something transformational. And instead of scolding us for our foolishness, as the mother does Jack in the story, our leaders will present to us new beans with promises that this time we will surely be able to climb to the clouds.

Undeterred by broken promises, we will believe again. We’ll return to the window and stare at the soil, positive that this time there will be growth.

The eagerness to drink the Kool-Aid is our curse. It is also our blessing.

For what is teaching if not blind hope? Why keep showing up if you don’t carry within you an implausible faith in miracles? If we teachers believe that we, through nothing more than our dedication and effort, can turn around a kid who has everything going against him, then is it at all surprising that when a man offers to trade magic beans for our tired cow we jump at the opportunity?

We believe in miracles because we believe in the biggest miracle of all: That we, set against apathy and neglect, hunger and abuse, poverty and hopelessness, can make a difference. Against all odds, we believe in the future of every single student. It’s an absurd belief, one that no rational person would hold, one that the data have never supported, yet we believe it with every fiber of our being, just as we believe that this time, there will be a beanstalk.

______________


The IKEA Effect of Lesson Creation

The following is an excerpt from my new book, Leave School At School: Work Less, Live More, Teach Better. It’s available in both Kindle and print formats on Amazon.

I eat in the teachers’ lounge, and almost every day someone brings in one of those Lean Cuisine frozen lunches and pops it in the microwave. You can trace the origins of such convenience foods to the years following World War II. The military had developed field rations and other foods meant to withstand long periods of storage and allow for easy preparation on the battlefield. After the war, several commercial food companies had leftover manufacturing facilities, so some of them created new freeze-dried and canned food products for domestic use. They pumped out boxes of fish sticks, canned peaches, and even ill-fated cheeseburgers-in-a-can. Jell-O introduced new dessert flavors throughout the 1950s. Sales soared.

With so many new products to sell, advertisements swept across the amber waves and purple mountains, reminding Americans again and again how busy they were, how hectic their days had become, and how desperately they needed quick meals. “If you’re a typical modern housewife, you want to do your cooking as fast as possible,” wrote a columnist at Household magazine who was promoting instant coffee and canned onion soup. Kellogg’s even created cereal that could be served faster. Their ads claimed that busy moms loved their presweetened Corn Pops. Because who had time for the laborious task of sprinkling on a spoonful of sugar?

TV dinners. Minute rice. Instant potatoes. “Hot breads—in a jiffy!” All were peddled to harried housewives who just didn’t have enough hours in the day to cook like their mothers had. “It’s just 1-2-3, and dinner’s on the table,” exclaimed an article in Better Homes & Gardens. “That’s how speedy the fixing can be when the hub of your meal is delicious canned meat.” [1]

But the faster the cooking, the less it felt like real cooking and the greater the potential for guilt on the part of the homemaker. That was the problem with instant cake mix. Intended to save busy housewives time by having them simply add water to a mix, stir, and pop it in the oven, instant cake mix seemed like a fantastic idea. But sales fizzled after a few years. It turned out that TV dinners or the kids’ cereal were one thing, but a cake — well, that was another matter. Any homemaker worth her salt wouldn’t serve a generic cake from a box, one her guests could just as easily have made themselves.

When marketers dove in to uncover what went wrong with cake mix, they discovered that it was too easy. The solution was simple: Have the baker add an egg. Once the powdered egg was removed from the mix, sales recovered and instant cake mixes became a mainstay in nearly every home in America. By adding one step to the mixing process, homemakers felt they were really baking again.

The cake mix lesson has since been repeated many times over. Build-a-Bear sends you the raw materials and the directions, but it’s up to you to actually build the bear. Cooks at “patron-prepared” restaurants like Mongolian Barbecue will cook the food for you, but only after you select the ingredients. City-dwellers take “Haycations,” where they pay farmers to do their work for them. And of course, there’s IKEA, which sells furniture at a discount because buyers have to build their own bookcases, cabinets, and tables. In each of these instances, people seem to place more value on items to which they have contributed some labor.

With this in mind, three psychologists, Michael Norton, Daniel Mochon, and Dan Ariely, conducted a series of studies to find out whether consumers would, in fact, pay more money for products they themselves assembled. The research consisted of three different experiments.

In the first experiment, the researchers found that participants were willing to pay 63% more for furniture they had assembled themselves than for furniture that came pre-assembled.

In the second experiment, Norton, Mochon, and Ariely asked subjects to fold origami frogs or cranes and then asked how much they would pay for their own work. Next, the researchers gathered a separate group of volunteers who had not made any origami and asked them how much they would pay for the participants’ creations, and then how much they would pay for origami folded by an expert. These people, who had no personal connection to the creations, were willing to pay more for the expert’s products, which is exactly what one would expect. The participants who had made the frogs and cranes were then shown a display containing both the origami they had built themselves and the origami built by the experts, and they were asked to bid on each. The builders perceived the origami they had created as being of equal quality to those created by the pros.

The results of these studies suggest that when people construct a particular product, even if they do a cruddy job of it, they will value it more than if they had not put any effort into its creation.

Participants, wrote Norton and colleagues, “saw their amateurish creations as similar in value to experts’ creations, and expected others to share their opinions.”

The psychologists dubbed this the IKEA effect.

Two Problems For Teachers

There are two problems the IKEA effect creates for teachers. The first is that what you make is likely not nearly as good as you think it is. Your rubric is not better than another teacher’s. You just think it is because you made it. Same goes for everything else you’ve created. You would almost assuredly be better off using a product made by someone else. And as much as you don’t want to hear it, you’d be best off using products created by people whose job is to create those products. So while it may offend your sensibilities, stick with the program your district spent thousands of dollars on because it’s probably better than anything you’re going to design.

The second problem is that there is a cost to spending time creating stuff. If you spend an hour making a magnetism unit because you tell yourself it will be better than anything you currently have in your filing cabinet or that you can find online, then you’ve lost the opportunity to spend that hour doing other things. You could have used the time on something that will make a difference for your students. You could have spent it doing an activity you enjoy. You could have even taken a nap during that hour and gone to work the next day better rested. The science is harsh but clear: If you’re a teacher who creates his own materials, you’re wasting your most precious resource making stuff that isn’t very good, in spite of the fact that you can find better resources with a few clicks of your mouse, or even more simply, by opening your teacher’s guide.

For the teacher looking to improve his effectiveness while spending less time working, the IKEA effect gives you permission to stop making stuff and steal (or purchase) from others.

——–

[1] Shapiro, Laura. Something from the Oven: Reinventing Dinner in 1950s America. Penguin Books, 2005.

_________________________


4 Civil Rights Lessons Worth Teaching

Guest Post:

Here at Owl Eyes, we’ve recently been publishing and annotating primary source documents from American history. Some of the most illuminating texts to read and write about have been those from the Civil War and Reconstruction eras. The conflicts waged and resolutions struck in those years have done much to shape the United States as it stands today.

One of the most critical laws passed in the wake of the Civil War was the Civil Rights Act of 1866, a law that defined American citizenship and sought to protect African Americans from those who wished to take away their rights as citizens. Revisiting the Civil Rights Act of 1866 in 2018 reveals some fascinating and relevant lessons about civil rights, political change, and government in the United States. For educators interested in teaching the history of civil rights in the classroom, these lessons are well worth discussing.

1. Civil rights must be fought for and won.

The first century of American history tells us that civil rights are not merely granted. They must be fought for, delineated in painstaking detail, and carefully preserved for future generations. Rights require work. Because the founders set sail on the waters of nationhood in order to be free of the tyranny of the British crown, it is tempting to view “Life, Liberty, and the pursuit of Happiness” as the default condition of the American individual. Not so. The values set forth in the founding documents mark an ideal to strive toward, a national myth, not a description of American life.

In the century following the nation’s birth, one particular issue increasingly revealed the gulf between the dream of Jefferson’s “unalienable rights” and the stark realities of the young nation: slavery. Anyone wondering whether “all men are created equal” needed only to survey the back-breaking slave labor that fueled the cotton plantations of the South to discover a resounding answer. The North noticed the problem. Cue the Civil War.

The scale of the war—its costs and casualties—revealed the split visions of American values. In the North, “Life, Liberty, and the pursuit of Happiness” were seen as rights for all Americans, or at least all men. In the South, the phrase read like a list of privileges for wealthy white men, especially those who owned land and slaves. Even after the North defeated the South in war, the progressive politicians in Congress had to pass a bevy of laws, acts, and constitutional amendments in order to clarify that “all men are created equal.” These laws, which sought to give freedom, citizenship, voting rights, and safety to African Americans, received pushback at every step. The basic tenets of civil rights needed to be refreshed in the mid-20th century and remain debated to this day. Civil rights always need to be fought for.

Discussion Questions for Teachers and Students:

What do Jefferson’s “unalienable rights” to “Life, Liberty, and the pursuit of Happiness” mean to you?
What are the similarities and differences between the Civil Rights Movement of the 1950s-60s and the progress made by Congress during Reconstruction?
Which civil rights issues do you find most relevant and pressing today?

Recommended Reading:

The Declaration of Independence
Frederick Douglass’s “What to the Slave is the Fourth of July?”
The Thirteenth Amendment to the Constitution

2. Government is an evolving process.

The other political issue that dominated American politics in the 19th century was states’ rights. The issue of states’ rights—which persists to this day—refers to the struggle between the federal government and the state governments over which level of government should have the power to pass and enforce laws. While the founders of the United States sought to lay out in clear terms the systems of government, one question arose and, unanswered, began to hover like a storm cloud over American politics: How much power should the federal government possess?

The dispute over this exact question defined the Civil War and Reconstruction. In many ways, this clash mirrored the clash over slavery and civil rights. Put simply, the North wanted to end slavery and expand civil rights and therefore wished to arm the federal government with the powers to do so; the South wanted to maintain slavery and limit civil rights and therefore wished to keep the federal government too weak to change anything.

It is no surprise, then, that the push for freedom, equality, and civil rights in the 1860s also empowered the federal government. That trend began with the Civil War. The Union’s victory over the Confederacy was, in itself, a victory for the federal government over the autonomy of the states. Much of the progressive legislation of the time explicitly declared the federal government’s authority to enforce each new law across the states. The Fourteenth Amendment included a critical clause through which the protections of the Bill of Rights would eventually be applied to the states. Revisiting this historical period reveals how the federal government itself is an ever-evolving process rather than a fixed reality.

Discussion Questions for Teachers and Students:

Over the course of American history, how has the relationship between the federal government and state governments changed? Do you think the federal government has become more or less powerful? Explain your reasoning.
Some politicians and historians have argued that the Civil War was more about the issue of states’ rights than slavery. How valid is this claim, and why?
To what extent is a strong central government needed to institute and enforce civil rights? Is it possible to institute civil rights at the state or local level? Why or why not?

Recommended Reading:

The Bill of Rights
The Reconstruction Acts of 1867 and 1868
The Fourteenth Amendment to the Constitution

3. There have always been progressives and conservatives.

If the current political landscape appears to be a staged clash between progressive and conservative sides, it has always been so. In the nation’s founding era, there were the Federalists and the Anti-Federalists; today there are the Democrats and the Republicans. The times change, as do the names of the political parties, but this central polarity does not. Progressives are more egalitarian, pushing social reforms and large government programs. Conservatives are more independent, seeking to minimize the government’s involvement in human affairs. Progressives want change; conservatives want tradition.

In a reversal of today’s parties, the progressives of the Reconstruction era were known as Republicans; the conservatives, Democrats. The push for civil rights in the United States has always been a progressive agenda, and so it was the Republicans of the 1860s—known as the “Radical Republicans”—who emancipated the slaves, abolished slavery, created the Freedmen’s Bureau, and gave African Americans citizenship and the right to vote. The 1860s represent an example of when progressive lawmakers took enormous—and often hard-won—strides in the ethically correct direction.

Discussion Questions for Teachers and Students:

Describe the party politics of the 1860s between the Radical Republicans and Southern Democrats. In what ways do today’s progressive and conservative parties reflect those of the 1860s? In what ways do they differ?
Do the opposing forces of progress and tradition—which can be found throughout world history—represent a conflict or a balance or both? Explain your answer.
What are other historical examples of conflicts between progressive and conservative sides? Consider other places and periods in history.

Recommended Reading:

The Federalist Papers
The Emancipation Proclamation
The Freedmen’s Bureau Bill

4. Presidents can be overpowered and overruled.

Following the end of the Civil War and the assassination of President Abraham Lincoln in 1865, Andrew Johnson took the presidential office. Unlike Lincoln, Johnson was a Southern Democrat and brought with him a highly conservative agenda. He attempted to reinstate the Southern state governments and resurrect the crushed postwar South. Furthermore, he tried to block, veto, and argue against the progressive laws passed by the predominantly Republican Congress.

Despite Johnson’s desire to return the United States to its prewar condition, the Republicans in Congress pushed for a better future, ignoring Johnson’s numerous vetoes in their march towards greater equality and civil rights. The best example of this trend is the Civil Rights Act of 1866, which Johnson had vetoed before the Senate and House of Representatives overrode the veto in a cascade of congressional votes. It was the first major veto override in American legislative history. Andrew Johnson’s term in office shows how American presidents can be overpowered and overruled, especially if they conflict with the stronger political forces of their time.

Discussion Questions for Teachers and Students:

How does a presidential veto work? Do you think the presidential veto is a proper tool in the system of checks and balances? Explain your answer.
What are some other times in American history when a president clashed with Congress? What happened?
Beyond American history, what are other examples of world leaders who tried to halt or slow the forces of change and progress? What happened?

Recommended Reading:

The Civil Rights Act of 1866, along with Johnson’s veto message
The Reconstruction Acts of 1867 and 1868
Frederick Douglass’s essay “Reconstruction”

_____________

We hope you find American history as fascinating and valuable as we do. In particular, the history of the Civil War and Reconstruction offers key insights into the current landscape of the United States, revealing important lessons about the workings of the government and the attainment of civil rights.

— Zachary, @ Owl Eyes
_____________

Zachary is an associate editor at OwlEyes.org, where he works with a talented team of fellow book nerds to make classic literature enriching and fun for teachers and students alike. Follow Owl Eyes on Twitter. 

The Dumbest Argument Against Independent Reading

I’m in my eighteenth year of teaching, and I’ve set aside time for student self-selected reading every day for every one of those years. It is the most sacred item on my agenda. On those days when we have an assembly and a fire drill and a bee gets in the room and blows a ten-minute hole in my science lesson and I have to cut something, I never cut independent reading.

As a student teacher, I was fortunate enough to be placed with a mentor teacher who valued independent reading time as much as I do. But even back then, her principal viewed those 30 minutes as wasteful. When the administration adopted a new program and my mentor teacher was wondering where she would fit it in, the principal suggested she just get rid of that student reading time.

It’s been that way ever since. I have never had an administrator who offered a full-throated endorsement of independent reading. More often, it’s the opposite. I’ve sat in meetings where principals presented research showing independent reading wasn’t effective. (Not true, by the way. Read more here.) I’ve known teachers who were flat out told to end the practice. I’ve sat in meetings where an administrator’s minion (a “coach,” she was rather hilariously called) questioned its efficacy.

An aside: My sneaking suspicion is that administrators don’t like independent reading because teachers aren’t doing enough. This is where the criticism of Drop Everything And Read came from. Teachers, those valuable professionals who eat up the lion’s share of district budgets, shouldn’t be getting paid to sit around reading with their students when they could be teaching. It’s a belief that permeates the entire day. Although providing students feedback is a critical part of the learning process, most teachers I know wouldn’t be caught dead grading student work while students are in the room. Teachers are supposed to teach, every second of every day. And they’re supposed to do all that other teachery stuff during their prep time (good luck with that).

In fairness, some data does suggest that independent reading isn’t effective for our lowest readers. The reason independent reading doesn’t work for the lowest readers, the research has concluded, is that those students — wait for it — don’t use the time to read (or they attempt to read stuff that’s too hard, which is just another way of saying they don’t read). Those students, we are told, should be engaged with the teacher in direct instruction.

This is quite possibly the dumbest reason to stop doing something I have ever heard. I can think of no other thing we do inside the classroom or out of it where we would apply the same logic.

–Students who don’t pay attention to our lessons don’t learn as much, so we should stop teaching lessons.

–Students who don’t do their math assignments don’t learn as much math, so we should stop assigning math.

–Basketball players who refuse to try hard in practice don’t get any better, so we should pull them off to the side and coach them separately.

–Your daughter refuses to practice piano when you ask her to, so you should stop giving her time to practice.

Of course the kids who don’t read during independent reading time don’t get better at reading. That doesn’t mean we should stop doing it. It means we should figure out how to get kids to do it, just like we would for anything else we believe is beneficial.

–We don’t stop making our kids take baths because they don’t like them.

–We don’t tell our daughters, “Ah, the hell with it, just leave your room filthy,” because they don’t want to clean it.

–We don’t allow our sons to eat pancakes and pizza for dinner every night because they don’t like fruits and vegetables.

And we shouldn’t just shrug our shoulders when students don’t want to read. Nor should we pull them back and make them read to us. Reading to oneself is a life skill that has the potential to change futures.

Yes, we should teach reading lessons. We should intervene with kids who struggle. But we should also provide the time for kids to read whatever they want to themselves. We shouldn’t give up just because a handful would rather not.

–My mom got me to eat celery by slathering it with peanut butter.

–My dad got me to clean my room by threatening consequences if I didn’t.

–My third grade teacher got me to turn in my homework by announcing to the class which kids didn’t turn theirs in.

Get creative. Pull out all the stops. Get kids to read to themselves.

For some, that might mean helping them find books they’re interested in or guiding them toward books they can actually read. It might mean establishing a culture where kids don’t feel self-conscious about reading books at a lower level than their peers. It could even mean — gasp — consequences for not reading, just as there would be for kids who refuse to do their math, try hard at basketball practice, or clean their rooms. Experiment. Get creative. But don’t just give up. Independent reading is too important.

Higher Education: Transitioning From a Teacher to a Professor

The following is a guest post by Dixie Somers. 

Many institutions of higher learning require that newly hired professors have some experience in K-12 classrooms. After entering the field, however, those individuals often struggle to balance drawing on their grade-level experience with developing lessons for adult students. The advantage of having such a background is that new professors understand what future teachers will face once in the classroom. An effective transition can be accomplished by keeping a few things in mind.

Many Jobs Come with New Professorship

Most new hires come in as assistant professors. In that capacity, you will be expected to teach, conduct research, and provide various services to the institution you work for. You’ll most likely be teamed with a tenured professor who will help you navigate your first few years. The teaching load usually consists of two to four courses per semester, but keep in mind that each class takes an enormous amount of preparation.

The second job is research. Institutions of higher learning depend on exposure, status, and reputation to attract quality students, and they earn that standing through publications. Professors, in turn, earn tenure largely on the strength of their publication record.

The third job is service to the institution. That can come in the form of serving on committees, organizing conferences and lectures, and advising students. This job serves a dual purpose: not only are you providing a much-needed service to the institution, but you are also gaining opportunities to network with other staff.

Get to Know Your Department Early

To move into a tenured position, you’ll need a strong endorsement from your department. As a result, networking is extremely important. Each department has its own culture and patterns of behavior that you’ll have to learn and adapt to. Keep in mind that departmental politics, different in kind from what you saw in grade-level schools, will permeate everything; you’ll need to learn the nuances of the people and the structure of the department so you don’t get pulled into the middle of disputes. The best way to get to know your new department is to attend all of its functions, formal and informal. At such events, ask uncontroversial questions about things you’ve heard, then listen to the stories people tell; they will be enlightening. Most importantly, find ways to relieve stress while getting to know your co-workers in a less constrained setting, such as the gym.

Change Your Perspective on Being an Education Professional

Transitioning from a classroom teacher to a professor at a college like Stevens Henager College can be a challenge. One of the key things to remember is that faculty members treat each other and their students differently than teachers do in grade-level schools. You need to establish social distance between yourself and your students, and they need to understand that you are not their peer or friend. There are two ways to establish this.

The first is with your dress: if you wear professional clothing, you’ll be treated as a professional. The second is to establish your position by using your title of “Dr.” or “Professor.” You don’t have to appear as if you know it all. In fact, you’ll gain more respect if you say, “I don’t know, but I’ll look it up before we meet again.” In many cases, asking everyone to seek out the answer and share what they found during the next class can be a great learning opportunity for students. Feel confident in the fact that, in your field, you are an expert.

Your future as a professional in higher education will ultimately depend on several things: your teaching record, evaluations, publications, outside letters attesting to your standing in the field, and your record of service. The upside is that you’ll have more freedom in academia than you ever thought possible.

 

Dixie Somers is a freelance writer and blogger for business, home, and family niches. Dixie lives in Phoenix, Arizona, and is the proud mother of three beautiful girls and wife to a wonderful husband.