From the London Newspapers ...
Saturday November 15 2008
Full Text Link: http://www.guardian.co.uk/books/2008/nov/15/malcolm-gladwell-outliers-extract
The University of Michigan opened its new computer centre in 1971, in a low-slung building on Beal Avenue in Ann Arbor. The university's enormous mainframe computers stood in the middle of a vast, white-tiled room, looking, as one
faculty member remembers, "like one of the last scenes in 2001: A Space
Odyssey". Off to the side were dozens of key-punch machines - what
passed in those days for computer terminals. Over the years, thousands
of students would pass through that white-tiled room - the most famous
of whom was a gawky teenager named Bill Joy.
Joy came to the University of Michigan the year the computer centre opened, at the age of 16. He had been voted "most studious student" by his graduating
class at North Farmington high school, outside Detroit, which, as he
puts it, meant he was a "no-date nerd". He had thought he might end up
as a biologist or a mathematician, but late in his freshman year he
stumbled across the computing centre - and he was hooked.
From then on, the computer centre was his life. He programmed whenever he
could. He got a job with a computer science professor, so he could
program over the summer. In 1975, Joy enrolled in graduate school at
the University of California, Berkeley. There, he buried himself even
deeper in the world of computer software. During the oral exams for his
PhD, he made up a particularly complicated algorithm on the fly that -
as one of his many admirers has written - "so stunned his examiners
[that] one of them later compared the experience to 'Jesus confounding
his elders'".
Working in collaboration with a small group of programmers, Joy took on the task of rewriting Unix, a software system developed by AT&T for mainframe computers. Joy's version was so good that it became - and remains - the operating system on which millions of computers around the world run. "If you put your Mac in
that funny mode where you can see the code," Joy says, "I see things
that I remember typing in 25 years ago." And when you go online, do you
know who wrote the software that allows you to access the internet?
Bill Joy.
After Berkeley, Joy co-founded the Silicon Valley firm Sun Microsystems. There, he rewrote another computer language, Java, and his legend grew still further. Among Silicon Valley insiders, Joy is spoken of with as much awe as Bill Gates. He is sometimes called the Edison of the internet.
The story of Joy's genius has been told many times, and the lesson is always the same. Here was a world that was the purest of meritocracies. Computer programming didn't operate as an old-boy network, where you got ahead because of money or connections. It was a wide-open field, in which all participants were
judged solely by their talent and accomplishments. It was a world where
the best men won, and Joy was clearly one of those best men.
Sport, too, is supposed to be just such a pure meritocracy. But is it? Take
ice hockey in Canada: look at any team and you will find that a
disproportionate number of players will have been born in the first
three months of the year. This, it turns out, is because the eligibility cut-off date for the nine-year-old league, the 10-year-old league and so on is January 1. Boys who are oldest and
biggest at the beginning of the hockey season are inevitably the best.
And so they get the most coaching and practice, and they get chosen for
the all-star team, and so their advantage increases - on into the
professional game. A similar pattern applies to other sports. What we
think of as talent is actually a complicated combination of ability,
opportunity and utterly arbitrary advantage.
Does something similar apply to outliers in other fields, such as Bill Joy? Do they
benefit from special opportunities, and do those opportunities follow
any kind of pattern? The evidence suggests they do.
In the early 90s, the psychologist K Anders Ericsson and two colleagues set up shop
at Berlin's elite Academy of Music. With the help of the academy's
professors, they divided the school's violinists into three groups. The
first group were the stars, the students with the potential to become
world-class soloists. The second were those judged to be merely "good".
The third were students who were unlikely ever to play professionally,
and intended to be music teachers in the school system. All the violinists were then asked the same question: over the course of your career, ever since you first picked up the violin, how many hours have you practised?
Everyone, from all three groups, started playing at roughly the same time - around the age of five. In those first few years, everyone practised roughly the same amount - about two or three hours a week. But around the age of eight real differences started to emerge. The students who would end up as the best in their class began to practise more than everyone else: six hours a week by age nine,
eight by age 12, 16 a week by age 14, and up and up, until by the age
of 20 they were practising well over 30 hours a week. By the age of 20,
the elite performers had all totalled 10,000 hours of practice over the
course of their lives. The merely good students had totalled, by
contrast, 8,000 hours, and the future music teachers just over 4,000
hours.
The curious thing about Ericsson's study is that he and his colleagues couldn't find any "naturals" - musicians who could float effortlessly to the top while practising a fraction of the time that their peers did. Nor could they find "grinds", people who worked harder than everyone else and yet just didn't have what it takes to break into the top ranks. Their research suggested that once you have enough
ability to get into a top music school, the thing that distinguishes
one performer from another is how hard he or she works. That's it.
What's more, the people at the very top don't just work much harder
than everyone else. They work much, much harder.
This idea - that excellence at a complex task requires a critical, minimum level of
practice - surfaces again and again in studies of expertise. In fact,
researchers have settled on what they believe is a magic number for
true expertise: 10,000 hours.
"In study after study, of composers, basketball players, fiction writers, ice-skaters, concert pianists, chess players, master criminals," writes the neurologist
Daniel Levitin, "this number comes up again and again. Ten thousand
hours is equivalent to roughly three hours a day, or 20 hours a week,
of practice over 10 years... No one has yet found a case in which true
world-class expertise was accomplished in less time. It seems that it
takes the brain this long to assimilate all that it needs to know to
achieve true mastery."
This is true even of people we think of as prodigies. Mozart, for example, famously started writing music at six. But, the psychologist Michael Howe writes in his book Genius Explained, by the standards of mature composers Mozart's early works are not outstanding. The earliest pieces were all probably written down by his
father, and perhaps improved in the process. Many of Wolfgang's
childhood compositions, such as the first seven of his concertos for
piano and orchestra, are largely arrangements of works by other
composers. Of those concertos that contain only music original to
Mozart, the earliest that is now regarded as a masterwork (No 9, K271)
was not composed until he was 21: by that time Mozart had already been
composing concertos for 10 years.
To become a chess grandmaster also seems to take about 10 years. (Only the legendary Bobby Fischer got to that elite level in less than that time: it took him nine
years.) And what's 10 years? Well, it's roughly how long it takes to
put in 10,000 hours of hard practice.
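The arithmetic Levitin quotes above is easy to verify. This is a minimal back-of-envelope sketch using only the article's own figures (20 hours a week, or roughly three hours a day, for 10 years):

```python
# Check of the "10,000 hours in 10 years" arithmetic quoted from Levitin.
hours_per_week = 20
weeks_per_year = 52
years = 10

total_weekly = hours_per_week * weeks_per_year * years  # 20 h/week for 10 years
total_daily = 3 * 365 * years                           # ~3 h/day for 10 years

print(total_weekly)  # 10400 - comfortably past the 10,000-hour mark
print(total_daily)   # 10950
```

Either route clears 10,000 hours, which is why the two formulations are used interchangeably.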
Ten thousand hours is, of course, an enormous amount of time. It's all but impossible to reach that number, by the time you're a young adult, all by yourself. You
have to have parents who are encouraging and supportive. You can't be
poor, because if you have to hold down a part-time job on the side to
help make ends meet, there won't be enough time left over in the day.
In fact, most people can really only reach that number if they get into
some kind of special programme - like a hockey all-star squad - or get
some kind of extraordinary opportunity that gives them a chance to put
in that kind of work.
So, back to Bill Joy. It's 1971 and he's 16. He's the maths wiz, the kind of student that schools like MIT, Caltech or the University of Waterloo attract by the hundreds. "When Bill was a little kid, he wanted to know everything about everything
way before he should've even known he wanted to know," his father
William says. "We answered him when we could. And when we couldn't, we
would just give him a book." When he applied to college, Joy got a
perfect score on the maths portion of the scholastic aptitude test. "It
wasn't particularly hard," he says, matter-of-factly. "There was plenty
of time to check it twice." He could have gone in any number of
directions. He could have done a PhD in biology. He could have gone to
medical school. He could easily have had a "typical" college career:
lots of schoolwork, football games, drunken fraternity parties, awkward
encounters with girls, long discussions with roommates about the
meaning of life. But he didn't, because he stumbled across that
nondescript building on Beal Avenue.
In the 70s, when Joy was learning about programming, computers were the size of rooms. A single machine - which might have less power and memory than your microwave - could cost upwards of a million dollars. Computers were hard to get
access to, and renting time on them cost a fortune. This was the era
when computer programs were created using cardboard "punch" cards. A
complex program might include hundreds, if not thousands, of these
cards, in tall stacks. Since computers could handle only one task at a
time, the operator made an appointment for your program and, depending
on how many other people were ahead of you in line, you might not get
your cards back for several hours. And if you made even a single error
in your program, then you had to take the cards back, track down the
error and begin the whole process again. Under those circumstances, it
was exceedingly difficult for anyone to become a programming expert.
Certainly becoming an expert by your early 20s was all but impossible.
"Programming with cards," one computer scientist from the era
remembers, "did not teach you programming. It taught you patience and
proofreading."
That's where the University of Michigan came in. It was one of the first universities in the world to abandon computer cards for the brand-new system called "time-sharing". Computer scientists realised you could train a computer to handle hundreds of tasks at the same time. No more punch cards. You could build dozens of
terminals, link them all to the mainframe by a telephone line, and have
everyone programming - online - all at once.
This was the opportunity that greeted Bill Joy when he arrived on the Ann Arbor
campus in the autumn of 1971. "Do you know what the difference is
between the computing cards and time-sharing?" Joy says. "It's the
difference between playing chess by mail and speed chess." Programming
wasn't an exercise in frustration any more. It was fun.
According to Joy, he spent a phenomenal amount of time at the computer centre.
"It was open 24 hours. I would stay there all night, and just walk home
in the morning. In an average week in those years I was spending more
time in the computer centre than on my classes. All of us down there
had this recurring nightmare of forgetting to show up for class at all,
of not even realising we were enrolled."
Just look at the stream of opportunities that came Joy's way. Because he happened to go to a far-sighted school, he was able to practise on a time-sharing
system, instead of punch cards; because the university was willing to
spend the money to keep the computer centre open 24 hours, he could
stay up all night; and because he was able to put in so many hours, by
the time he was presented with the opportunity to rewrite Unix, he was
up to the task. Bill Joy was brilliant. He wanted to learn - that was a
big part of it - but before he could become an expert, someone had to
give him the opportunity to learn how to be expert.
"At Michigan, I was probably programming eight or 10 hours a day," he says. "By the
time I was at Berkeley, I was doing it day and night... " He pauses for
a moment, to do the maths in his head which, for him, doesn't take
long. "It's five years," he says, finally. "So, so, maybe... 10,000
hours? That's about right."
Is this a general rule of success? If you scratch below the surface of every great achiever, do you always find the equivalent of the Michigan Computer Centre or the hockey all-star team - some sort of special opportunity for practice? Let's
test the idea with two examples: the Beatles, one of the most famous
rock bands ever, and Bill Gates, one of the world's richest men.
The Beatles - John Lennon, Paul McCartney, George Harrison and Ringo Starr
- came to the US in February 1964, starting the so-called "British
Invasion" of the American music scene. The interesting thing is how
long they had already been playing together. Lennon and McCartney began
in 1957. (Incidentally, the time that elapsed between their founding
and their greatest artistic achievements - arguably Sgt Pepper's Lonely
Hearts Club Band and the White Album - is 10 years.) In 1960, while
they were still a struggling school rock band, they were invited to
play in Hamburg, Germany.
"Hamburg in those days did not have rock'n'roll music clubs. It had strip clubs," says Philip Norman, who wrote the Beatles' biography, Shout! "There was one particular club owner called Bruno, who was originally a fairground showman. He had the
idea of bringing in rock groups to play in various clubs. They had this
formula. It was a huge nonstop show, hour after hour, with a lot of
people lurching in and the other lot lurching out. And the bands would
play all the time to catch the passing traffic. In an American
red-light district, they would call it nonstop striptease.
"Many of the bands that played in Hamburg were from Liverpool," Norman
continues. "It was an accident. Bruno went to London to look for bands.
But he happened to meet a Liverpool entrepreneur in Soho, who was down
in London by pure chance. And he arranged to send some bands over.
That's how the connection was established. And eventually the Beatles
made a connection not just with Bruno, but with other club owners as
well. They kept going back, because they got a lot of alcohol and a lot
of sex."
And what was so special about Hamburg? It wasn't that it paid well. (It didn't.) Or that the acoustics were fantastic. (They weren't.) Or that the audiences were savvy and appreciative. (They were anything but.) It was the sheer amount of time the band was forced to play. Here is John Lennon, in an interview after the Beatles disbanded, talking about the band's performances at a Hamburg strip club called
the Indra: "We got better and got more confidence. We couldn't help it
with all the experience playing all night long. It was handy them being
foreign. We had to try even harder, put our heart and soul into it, to
get ourselves over. In Liverpool, we'd only ever done one-hour
sessions, and we just used to do our best numbers, the same ones, at
every one. In Hamburg we had to play for eight hours, so we really had
to find a new way of playing."
The Beatles ended up travelling to Hamburg five times between 1960 and the end of 1962. On the first trip, they played 106 nights, five or more hours a night. Their second trip they played 92 times. Their third trip they played 48 times, for a
total of 172 hours on stage. The last two Hamburg stints, in November
and December 1962, involved another 90 hours of performing. All told,
they performed for 270 nights in just over a year and a half. By the
time they had their first burst of success in 1964, they had performed
live an estimated 1,200 times, which is extraordinary. Most bands today
don't perform 1,200 times in their entire careers. The Hamburg crucible
is what set the Beatles apart.
"They were no good on stage when they went there and they were very good when they came back," Norman says. "They learned not only stamina, they had to learn an enormous amount of numbers - cover versions of everything you can think of, not
just rock'n'roll, a bit of jazz, too. They weren't disciplined on stage
at all before that. But when they came back they sounded like no one
else. It was the making of them."
Let's now turn to the history of Bill Gates. His story is almost as well-known as the Beatles'. Brilliant young maths wiz discovers computer programming. Drops out of
Harvard. Starts a little computer company called Microsoft with his
friends. Through sheer brilliance, ambition and guts builds it into the
giant of the software world.
Now let's dig a bit deeper. Gates' father was a wealthy lawyer in Seattle, and his mother was the daughter of a well-to-do banker. As a child Gates was precocious, and easily bored by his studies. So his parents took him out of public school, and
at the beginning of seventh grade sent him to Lakeside, a private
school that catered to Seattle's elite families. Midway through Gates'
second year, the school started a computer club. "The Mothers' Club at
school did a rummage sale every year, and there was always the question
of what the money would go to," Gates remembers. "That year, they put
$3,000 into buying a computer terminal down in this funny little room
that we subsequently took control of. It was kind of an amazing thing."
Even more remarkable was the kind of computer Lakeside bought:
it was an ASR-33 Teletype, a time-sharing terminal with a direct link
to a mainframe computer in downtown Seattle. "The whole idea of
time-sharing only got invented in 1965," Gates says. "Someone was
pretty forward looking."
From that moment on, Gates lived in the computer room. He and a number of others began to teach themselves how to use this strange new device. The parents raised more money to buy time on the mainframe computer. The students spent it. As luck
would have it, Monique Rona, one of the founders of C-Cubed - a company
that leased computer time - had a son at Lakeside, a class ahead of
Gates. Would the Lakeside computer club, Rona wondered, like to test
out the company's software programs on the weekends in exchange for
free programming time? Absolutely!
Before long, Gates and his friends latched on to another outfit called ISI, which agreed to let them have free computer time in exchange for working on a piece of
software that could be used to automate company payrolls. In one
seven-month period in 1971, Gates and his cohorts ran up 1,575 hours of
computer time on the ISI mainframe, which averages out at eight hours a
day, seven days a week.
"It was my obsession," Gates says of his early high school years. "I skipped athletics. I went up there at night. We were programming on weekends. It would be a rare week that we wouldn't get 20 or 30 hours in. There was a period where Paul Allen and I got in trouble for stealing a bunch of passwords and crashing the
system. We got kicked out. I didn't get to use the computer the whole
summer. This is when I was 15 and 16. Then I found out Paul had found a
computer that was free at the University of Washington. They had these
machines in the medical centre and the physics department. They were on
a 24-hour schedule, but with this big slack period so between three and
six in the morning they never scheduled anything." Gates laughed.
"That's why I'm always so generous to the University of Washington,
because they let me steal so much computer time. I'd leave at night,
after my bedtime. I could walk up to the university from my house. Or
I'd take the bus." Years later, Gates' mother said, "We always wondered
why it was so hard for him to get up in the morning."
Through one of the founders of ISI, Gates landed a secondment programming a
computer system at the Bonneville Power station in southern Washington
State. There, he spent the spring of his senior year writing code.
Those five years, from eighth grade to the end of high school, were Bill
Gates' Hamburg, and by any measure he was presented with an even more
extraordinary series of opportunities than Bill Joy. And virtually
every one of those opportunities gave Gates extra time to practise. By
the time he dropped out of Harvard, he'd been programming nonstop for
seven consecutive years. He was way past 10,000 hours. How many
teenagers had the kind of experience Gates had? "If there were 50 in
the world, I'd be stunned," he says.
If you put together the stories of hockey players and the Beatles and Bill Joy and Bill Gates, I think we get a more complete picture of the path to success. Joy,
Gates and the Beatles are all undeniably talented. Lennon and McCartney
had a musical gift, of the sort that comes along once in a generation,
and Joy, let us not forget, had a mind so quick that he could make up a
complicated algorithm on the fly that left his professors in awe. A
good part of that "talent", however, was something other than an innate
aptitude for music or maths. It was desire. The Beatles were willing to
play for eight hours straight, seven days a week. Joy was willing to
stay up all night programming. In either case, most of us would have
gone home to bed. In other words, a key part of what it means to be
talented is being able to practise for hours and hours - to the point
where it is really hard to know where "natural ability" stops and the
simple willingness to work hard begins.
What is so striking about these success stories is that the outliers were the beneficiaries of some kind of unusual opportunity. Lucky breaks don't seem like the
exception with software billionaires, rock bands and star athletes;
they seem like the rule.
Recently Forbes Magazine compiled a list of the 75 richest people in history. It includes queens and kings and pharaohs from centuries past, as well as contemporary billionaires such as Warren Buffett and Carlos Slim. However, an astonishing 14 on the list are Americans born within nine years of each other in the mid-19th century. In other words, almost 20% of the names come from a single
generation - born between 1831 and 1840 in a single country. The list
includes industrialists and financiers who are still household names
today: John Rockefeller, born in 1839 (the richest of the lot); Andrew
Carnegie, 1835; Jay Gould, 1836; and JP Morgan, 1837.
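The "almost 20%" figure above follows directly from the article's own numbers:

```python
# 14 of the 75 names on the Forbes list fall in the 1831-1840 window.
share = 14 / 75 * 100

print(round(share, 1))  # 18.7 - "almost 20%" of the list
```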
What's going on here is obvious, if you think about it. In the 1860s and
1870s, the American economy went through perhaps the greatest
transformation in its history. This was when the railways were built,
and when Wall Street emerged. It was when industrial manufacturing
started in earnest. It was when all the rules by which the traditional
economy functioned were broken and remade. What that list says is that
it was absolutely critical, if you were going to take advantage of
those opportunities, to be in your 20s when that transformation was
happening.
If you were born in the late 1840s, you missed it - you were too young to take advantage of that moment. If you were born in the 1820s, you were too old - your mindset was shaped by the old, pre-civil war ways. But there is a particular, narrow nine-year window that was just perfect. All of the 14 men and women on that list had
vision and talent. But they also were given an extraordinary
opportunity, in the same way that hockey players born in January,
February and March were given an extraordinary opportunity.
Let's do the same kind of analysis for software tycoons such as Bill Joy and Bill Gates.
Veterans of Silicon Valley will tell you that the most important date in the
history of the personal computer revolution was January 1975. That was
when the magazine Popular Electronics ran a cover story on a machine
called the Altair 8800. The Altair cost $397. It was a do-it-yourself
contraption that you could assemble at home. The headline on the story
read: Project Breakthrough! World's First Minicomputer Kit To Rival
Commercial Models. To readers of Popular Electronics, then the bible of
the fledgling software and computer world, that headline was a
revelation. Computers up to that point were the massive, expensive
mainframes of the sort sitting in the white-tiled expanse of the
Michigan computing centre. For years, every hacker and electronics wiz
had dreamed of the day when a computer would come along that was small
and inexpensive enough for an ordinary person to use and own. That day
had finally arrived.
If January 1975 was the dawn of the personal computer age, then who would be in the best position to take advantage of it? If you're a few years out of college in 1975, and if you have had any experience with programming at all, you would have
already been hired by IBM or one of the other traditional, old-line
computer firms of that era. You belonged to the old paradigm. You have
just bought a house. You're married. A baby is on the way. You're in no
position to give up a good job and pension for some pie-in-the-sky $397
computer kit. So let's also rule out all those born before, say, 1952.
At the same time, though, you don't want to be too young. You can't seize
the moment if you're still in high school. So let's also rule out
anyone born after, say, 1958. The perfect age to be in 1975, in other
words, is young enough to see the coming revolution but not so old as
to have missed it. You want to be 20 or 21, born in 1954 or 1955.
Let's start with Gates, the richest and most famous of all Silicon Valley
tycoons. When was he born? Bill Gates: October 28 1955. The perfect
birthdate. Gates is the hockey player born on January 1.
Gates' best friend at Lakeside was Paul Allen. He also hung out in the
computer room with Gates, and shared those long evenings at ISI and
C-Cubed. Allen went on to found Microsoft with Gates. Paul Allen:
January 21 1953.
The third richest man at Microsoft is the one who has been running the company on a day-to-day basis since 2000 - one of the most respected executives in the software world, Steve Ballmer. Steve Ballmer: March 24 1956.
And let's not forget a man every bit as famous as Gates, Steve Jobs, the co-founder of Apple Computer. He wasn't from a rich family, like Gates, and he didn't go to Michigan, like Joy. But it doesn't take much investigation of his upbringing to
realise that he had his Hamburg, too. He grew up in Mountain View,
California, just south of San Francisco, which is the absolute
epicentre of Silicon Valley. His neighbourhood was filled with
engineers from Hewlett-Packard, then, as now, one of the most important
electronics firms in the world. As a teenager he prowled the flea
markets of Mountain View, where electronics hobbyists and tinkerers
sold spare parts. Jobs came of age breathing the air of the very
business he would later dominate. He picked the brains of
Hewlett-Packard engineers and once even called Bill Hewlett, one of the
company's founders, to request parts. Jobs not only received the parts
he wanted, he managed to wangle a summer job. He worked on an assembly
line to build computers and was so fascinated that he tried to design
his own... Steve Jobs was born on February 24 1955.
Another of the pioneers of the software revolution was Eric Schmidt. He ran
Novell, one of Silicon Valley's most important software firms, and in
2001 became the chief executive officer of Google. He was born on April
27 1955.
I don't mean to suggest, of course, that every software tycoon in Silicon Valley was born in 1955. But there are very clearly patterns here, and what's striking is how little we seem to want to talk about them. We pretend that success is a matter of individual merit. That is not the whole story. These are stories about people who
were given a special opportunity to work really hard and seized it, and
who happened to come of age at a time when that extraordinary effort
was rewarded by the rest of society. Their success was not of their own
making. It was a product of the world in which they grew up. Their
success, in other words, wasn't due to some mysterious process known
only to themselves. It had a logic, and if we can understand that
logic, think of all the tantalising possibilities that opens up.
By the way, let's not forget Bill Joy. Had he been just a little bit older
and had to face the drudgery of programming with computer cards, he
says he would have studied science. Bill Joy the computer legend would
have been Bill Joy the biologist. In fact, he was born on November 8
1954. And his three fellow founders of Sun Microsystems - one of the
oldest and most important of Silicon Valley's software companies? Scott
McNealy: born November 13 1954. Vinod Khosla: born January 28 1955.
Andy Bechtolsheim: born June 1955.
© Malcolm Gladwell 2008.
• This is an edited extract from Outliers: The Story Of Success, by
Malcolm Gladwell, to be published on November 27 by Allen Lane at
£16.99. Malcolm Gladwell: Live In London is on November 24 at 5.45pm
and 8.30pm at the Lyceum Theatre, London. Tickets from £13.50 to
£26.50. To book, call 0844 412 1742 or go to malcolmgladwell-live.com. There will be an interview with Malcolm Gladwell in tomorrow's Observer.
By Nicola Woolcock Education Correspondent
How would you respond if your child refused to eat her vegetables with the words: “Mummy, I feel very uncomfortable having to eat all these peas”?
SEAL (Social and Emotional Aspects of Learning) has the enthusiastic support of ministers, who are currently exploring whether pupils should be assessed at school on their personal development as well as their academic achievements.
As a mark of how far this approach to learning has gone already, schools in
Birmingham were told earlier this year that happiness in the classroom
should be treated with the same importance as academic achievement.
But Professor Hayes does not approve. Indeed, he believes that teaching emotional lessons in school gets in the way of learning and represents a form of child abuse that manipulates pupils into being victims.
He told a recent gathering of educationalists in London organised by the
Westminster Education Forum that schools are in danger of becoming
“social work centres staffed by psychiatrists brainwashing pupils”.
Millions of pounds, he says, are being spent on protecting children from
bullying, teaching them to respect others and coaching them in “proper
emotions”, such as empathy not anger.
“One mother told me that her son had learnt the ‘dealing with potentially
abusive situation’ scripts so well that at dinner he said, ‘Mummy, I
feel very uncomfortable having to eat all these peas’.”
Professor Hayes, who is co-author of a book called The Dangerous Rise of Therapeutic Education, believes that such tactics exacerbate problems by making children oversensitive.
And it was. An era of unbridled deregulation, wealth-enhancing perks for the already well-off, and miserly indifference to the poor and middle class; of the recasting of greed as goodness, the equation of bellicose provincialism with patriotism, the reframing of bigotry as small-town decency.
In short, it was the start of our current era. The Reagan Revolution was the formative political experience of my generation’s lifetime, like the Great Depression, the Second World War or Vietnam for those before us. And in its intellectual and moral paucity, in its eventual hegemony, these years shut down, for some of us, the ability to fully imagine another way. I will admit that back in January, when Barack Obama, in his post-Iowa victory speech, spoke about the “cynics,” the “they” who said “this country was too divided, too disillusioned to ever come together around a common purpose,” he was talking about me.
I will admit that the call of “change” did not speak to me as an achievable goal.Until it actually came.
On Wednesday, there was a run on newspapers, as voters rushed to grab a tangible piece of the history they’d made. My husband Max and I,unable to find extra copies, brought our own worn papers home to 8- and 11-year-old Emilie and Julia.Sept. 11, the seismic event that we’d feared would forever form their political consciousness, shaping their world and constricting the boundaries of the possible, had actually been eclipsed, light blotting out darkness, the best of America at long last driving away the demons of fear. We wanted them to see that it was the end of an era.
“Look,” we said, pointing to the headline “Racial Barrier Falls.” “This is huge.”
We labored to make them understand that their world – art that day,
and orchestra, and Baked Potato Bar at lunch – had irrevocably changed.
They were happy because we were happy. They rose to the occasion in that bemused way children do when adults tell them what they should feel. They were glad to be rid of George W. Bush and to be saved – for now – from the specter of Sarah Palin. (“It is not O.K. to say she’s an ‘idiot,’” I had snapped when they came home from school stoked by the mob. “Prove your case. Show, don’t tell.”)
They’d had, like many D.C. children, more than their share of politics. After first following the country into battle against the all-purpose boogeyman Saddam Hussein, they’d become antiwar. They had opinions on tax policy and spoke angrily about the “wealth gap.” In the past election year, they’d been fired up about the woman thing, in all its pretty girl versus smart girl iterations; in fact, they and their friends had remained hard-core Hillaryites long after their moms had moved on.
But the race thing? The groundbreaking enormity of the election of our country’s first African-American president?
“You’re being racist,” Emilie had said when I made a comment about how particularly earth-moving this election was for black voters. “Why should it matter if people are black or white?”
Theirs has often looked to me like a world drained of meaning. Girl power put to the service of selling Hannah Montana. Feel-good inclusiveness that occulted the very real conflicts, crimes and hatreds of history.
It isn’t easy to let go of the past to embrace something new, to risk heartbreak on the chance of the world’s actually having changed.
Or at least, it hasn’t been easy for me. But it comes naturally to some. Like the hundreds of George Washington University students who gathered in front of the White House on Tuesday night, cheering and screaming and shouting their goodbyes to the political era of their youth.
“Bliss it was to be alive, but to be young was very heaven,” Max emailed me, paraphrasing William Wordsworth on the French Revolution, at 11:30 p.m. on election night, after leaving his desk to walk among the revelers downtown. I, home with the kids, was in bed, sleeping the drugged sleep of an alcohol-abstaining migraineuse after drinking half a glass of celebratory champagne.
Colin Powell did not dance for joy over Obama’s victory; he wept.
“Look what we did. Look what we did,” he said, puffy-faced, red-eyed, fighting back more tears on CNN. “He’s won. It’s over.”
David Dinkins was similarly solemn. “Things do change. There is a God. They do get better,” said the mayor who presided over New York City at a time of toxic racial tensions.
Obama, too, resisted giddy gladness on Tuesday night. But he did proclaim an end to the world as we’ve known it for far too long.
“To those who would tear the world down: we will defeat you,” he promised. “This is our moment. This is our time.”
The glory of Barack Obama is that there are so many different kinds of us who can claim a piece of that “our.” African-Americans, Democrats, post-boomers, progressives, people who rose from essentially nowhere and through hard work and determination succeeded beyond their parents’ wildest dreams are the most obvious.
But there are also people who respect intelligence and good grammar. People who see their spouse as their “best friend,” as Barack called Michelle on Tuesday night. People whose children have the same knowing look as Sasha and Malia, who are probably more excited about their puppy than about their father’s presidency.
Two images will forever stay in my mind to mark this epoch-breaking election day. One is that of Jesse Jackson’s face, drenched in tears, in Chicago’s Grant Park on Tuesday evening.
And the other is a photo that ran in The Times on Wednesday. In it, a black mother and daughter sit on the floor of a church in Harlem. The mother, Latrice Barnes, having heard of Obama’s victory, is doubled up in tears; her daughter, Jasmine, is reaching a tentative hand up to soothe her. To me, she looks like the future, reaching out to heal the past.
It is, I suppose, in part a matter of temperament, whether one shouts or weeps at happy transformative moments. But I also think it’s a matter of what has come before. The young people joyfully frolicking in front of the Bush White House never knew the universe whose passing was marked by Obama’s victory and Jackson’s tears.
This moment of triumph marks the end of such a long period of pain, of indignity and injustice for African-Americans. And for so many others of us, of the trampling and debasing of our most basic ideals, beliefs that we cherished every bit as deeply and passionately as those of the “values voters” around whose sensibilities we’ve had to tiptoe for the past 28 years.
The election brought the return of a country we’d lost for so long that it was almost forgotten under the accumulated scar tissue of accommodation and acceptance.
For me, this will be the enduring memory of election night 2008: One generation released its grief. The next looked up confusedly, eager to please and yet unable to comprehend just what the tears were about.
By Judith Warner, in The New York Times Blogs
http://warner.blogs.nytimes.com/2008/11/06/title/