Blackbird, an online journal | Fall 2012 | Vol. 11, No. 2

HAL CROWTHER

Out of Date: The Joys of Obsolescence

The word itself looks weird and ancient, like something indecipherable scrawled on a cave wall or half-eroded from a decrepit tombstone. The word is “crwth,” a Welsh word—in English, where vowels rule, the word became “crowd.” The Irish version is “cruit.” The musical instrument that goes by these names is as archaic as the lovely Welsh language, and more obsolete. It’s a six-string bowed harp or lyre, something like a fiddle until you hear it, that has been traced back to eleventh-century Byzantium. The crwth has been played very little, if at all, since the seventeenth century; its history is so obscure that the principal authority cited by Wikipedia is Musical and Poetical Relicks of the Welsh Bards, published by Edward Jones in 1784. If you Google “crwth,” you’ll find this sentence, which more or less defines obsolescence: “Since the art of crwth-playing died out so completely, and since it was an instrument of the folk culture rather than part of the academic musical world, the exact manner—if, indeed, there ever was one exact manner—in which the instrument was traditionally played, like the tunings employed, will probably never be known for certain.”

A modest revival of interest in the crwth, part of the original instruments movement, is based largely on conjecture and experimentation. Elsewhere on the internet you can find a video of Sean Folsom—a rubicund American musician who bills himself as Sean the Piper—wearing a Renaissance gown and one of those flat guildsman’s hats that looks like a meat pie with a velvet ribbon, solemnly playing and patiently explaining the crwth. Its sound is fetching in a way, at least to me, but distinctly . . . medieval. At this point I confess what may be obvious, that my surname, Crowther, means crwth-player. Crwthist? I’m descended, at some point lost in time, from a musician who played songs now forgotten (“The repertoire of surviving crwth tunes is very small”) on an instrument no one now alive knows for certain how to play. Perhaps generations of these forlorn bards roosted in my family tree. People with surnames like Miller or Shoemaker, or the even more antiquated Cooper or Fletcher, remind us in a similar way of lost trades and traditions. But the Crowthers, as I see it, boast the most obsolete pedigree of all. Even the Flintstones seemed less out of date.

As a senior citizen with arthritis and cataracts, peering myopically down the smoking barrel of the twenty-first century, I know that bearing an expired name and representing an extinguished heritage has helped me to understand my life and accept the arc of my own fortunes. I’m no stranger to obsolescence. For twenty-five years I earned a living with an instrument now consigned to an oblivion even more complete than that of the crwth, because there will be no attempt to resurrect the typewriter. Yet this humble machine produced a comforting music of its own. Each year, inevitably and sadly, there are fewer of us who remember the companionable percussion of the Royal, the Olivetti, the Smith-Corona. The atonal but almost syncopated symphony of a big-city newsroom with its clicks and taps and ringing carriages, its tempo as varied as the speed and vigor of the fifty typists, was a sweet sound that will never be heard again on this planet, except in the fitful dreams of old reporters.

Grim electronics and grimmer economics brought an end to all that. But it’s idle nostalgia to grieve for the tools of yesteryear. Technology has its way with every profession, more often than not for the best. Lumberjacks with chainsaws rarely yearn for those crosscut saws their fathers strained to pull; modern fishermen don’t dream of hauling nets by hand. It may be sheer coincidence that the rapid disappearance of the typewriter and the rapid decline of journalism have run in such close parallel. That journalism has declined woefully is a very easy argument to document and sustain, but not without offending sincere individuals whose livelihood depends on its survival, in some as yet undetermined form.

I don’t blame young people for tuning out when a bitter graybeard evokes the good old days. Right or wrong, prescient or senescent, it’s an unbecoming role he’s playing, one I’d rather avoid. Mine is a personal, not a cultural, lament. Anyone who hasn’t registered the alarming divorce of broadcasting and professional journalism is simply very young. Yes, I can remember when foreign bureaus and documentaries—and thoughtful objectivity—were the pride of network news departments, and the cable news Punch-and-Judy was less than an embryo. As for print journalism, which seems to be following the Smith-Corona into oblivion, volumes have been published, and will be published (for your Kindle?) to explain what happened to us and why. I have little to add to a front-page story by New York Times media columnist David Carr, who chronicled the pitiful disintegration of the once proud and powerful Tribune Company, publisher of the Chicago Tribune and the Los Angeles Times.

Carr’s story, ironically published during National Newspaper Week, acquainted us with Tribune CEO Randy Michaels, a former radio shock jock assigned by billionaire Sam Zell to manage his controlling interest in the failing media giant. According to Carr, Michaels demoralized company headquarters in Chicago with a bizarre infusion of trash-radio culture—misogyny, sexual harassment, profane tirades, raunchy schoolboy humor, sleazy giveaways—that Tribune veterans could scarcely believe, much less tolerate. Carr reported that Michaels introduced himself to his new colleagues at a Chicago hotel bar by getting drunk and offering a waitress one hundred dollars to show him her breasts. “I have never seen anything like it,” recalled one eyewitness, no longer a Tribune employee. Carr also claimed that Michaels and an executive cadre of his old radio cronies looted the bankrupt Tribune by paying themselves huge bonuses while the company staggered ever deeper into debt.

Michaels—forced to resign a few weeks later—is a short, fat, bottom-fed individual who looks like the carny selling cheese fries at the county fair. To call him a scumbag is a description, not an insult, since a shock jock is a professional scumbag—it’s his stock in trade. Anyone who’s never heard one of these baboons castrate a pig in the studio or audiotape a sex act in St. Patrick’s Cathedral (both true) is a lucky American indeed. It’s no secret that radio is a latrine; we wince because this repulsive huckster hired and fired newsroom personnel for two legendary newspapers that were, within recent memory, among America’s most powerful and prestigious. Actually Michaels fired, mostly. More than four thousand Tribune Company employees have been terminated since he and Zell took over in 2008. For newspapers, the hour is very late. For newsmagazines, perhaps even later. Newsweek, where I once labored, died a cruel death in the marketplace. It was purchased for a dollar by a ninety-one-year-old stereo tycoon, now deceased, who merged it with The Daily Beast, a web magazine run by glitter-monger Tina Brown. (“Beastweek?” one reporter speculated.) Time magazine, another employer from my long-ago youth, has suffered similar financial and artistic decay and hangs like a decomposing albatross around Time Warner’s tired neck.

The news about the news is uniformly depressing. If you’re tired of hearing that from pundits and disgruntled old newshounds, take it straight from the Kansas Department of Education, which has cut off funding for high school journalism courses after a review of labor-market data. As reported in Newsweek, “the state deemed journalism a dying industry unfit for public funds, which are meant for ‘high-demand, high-skill or high-wage’ jobs.” Ouch. It’s only Kansas, but it stings. To make the rout more poignant, the first jobs eliminated by the stressed-out, stripped-to-survive print media were the very jobs I used to work. Columnists, book editors, film, drama and fine arts critics—along with investigative reporters, foreign and Washington correspondents and editorial cartoonists—were dismissed as highbrow luxuries for newsrooms trying to get a grip on grassroots America. If I should have the good luck to live another twenty years, I suppose my résumé will provoke as much bewilderment among my grandchildren as if I’d told them I was an itinerant crwth-player. They’ll nod and give me a cup of something warm and pat me on my trembling shoulder. (“He said he ‘reviewed’ books and films for newspapers. Yesterday he muttered something about ‘Pogo’ and ‘Li’l Abner.’ Has mama checked his medications?”)

But that’s in 2030, when we’ll all be living underground to escape the heat, and half a billion Americans will own three billion guns. (Not much hunting underground, but lots of stress to trigger domestic disputes.) My guess is that the information revolution will not be a major issue in 2030. I’m trying to process my obsolescence here in 2011, in my first year of eligibility for Medicare. The shock of having spent my professional life in “a dying industry” isn’t necessarily the most traumatic assault on my sense of self. No one who wasn’t in the work force—or in the world—in 1985 can possibly comprehend the speed or the magnitude of the technological metamorphosis we have just witnessed. Rip Van Winkle would have had to sleep two hundred years to wake to the future shock you’d experience if you’d only been napping since the Reagan administration. Those of us who were at mid-life or beyond when the cyber-tsunami struck faced adjustment pressures unlike any in human history.

How did I cope? There’s an analogy that appeals to me. Here in Maine where I’m writing, we’re menaced by a pack of homicidal drivers, usually males under thirty in big new trucks, who cruise narrow two-lane roads at NASCAR speeds, day or night. If you drive at conventional speed, the cowboy comes up behind you so fast the hairs stand up on your neck. If there’s no sane chance for him to pass you, as is often the case on these roads, you have two choices: You can speed up dangerously to increase the distance between your bumper and his, or you can signal a right turn and pull over on the shoulder to let him streak by. The first few times I saw this second maneuver, I wondered whether it represented courtesy or terror. But it is, of course, the right move, even if it goes against your grain. And it’s the move I chose, a few years ago, to accommodate the alien technology that came roaring up in my rearview mirror at one hundred miles per hour. I didn’t hit the accelerator, though by nature I’m more combative than acquiescent. I pulled off the road and let the monster roll on by. Just where it’s going, no one knows and I don’t care.

Once you pull over, you forever wear the big O for obsolete. You can’t hide it. The wired world snickers, your children roll their eyes, preadolescents with fists full of gadgets regard you with wonder, with pity. You become selectively illiterate, because the technology constantly spawns new words and acronyms you have no need or wish to learn. How painful is this excommunication, this life in the ghetto of the left-behind? Personally I like it here. The company and conversation are first-rate, even if most of the neighbors are drawing Social Security. The experience of dropping out of the tech rat race reminds me of a line in one of my wife’s best novels, about a mountain girl in her teens who finds herself pregnant and alone, “ruint” for good according to her parents and her community. A good reputation and a good marriage are now out of reach, but she finds it mysteriously liberating. “When you’re ruint,” Ivy says, “it frees you up some.”

When you’re out of date and committed to it, it frees you up some. I honestly doubt that I’ll live to regret it. The parade goes by, and it can be highly entertaining as long as you don’t have to march, to learn the cadence and keep up the pace. You pick a choice seat on the reviewing stand and watch, unencumbered by performance anxiety, status or public opinion. You don’t count anymore, as the marchers reckon it, and as Janis Joplin once sang in “Me and Bobby McGee,” “Freedom’s just another word for nothin’ left to lose.”

Loneliness and self-pity aren’t major problems when your obsolescence is more or less voluntary. Smugness—unearned self-congratulation—is more of a threat. I hate to sound smug. I realize I’m very fortunate—blessed—to have had the option to say no. There are talented journalists my age and even older, discarded by the dying industry, who are blogging, friending and Twittering themselves to exhaustion, hoping to catch the last seat on a train that’s already disappearing down the tracks. The loss of their self-respect is a personal tragedy; the loss of their gifts and experience is a national one. And of course the option of a life off the grid was never open to Americans who began their careers post-Gates, post-Jobs, post-Silicon Valley. We live in an age when even shepherds and forest rangers are probably wired from hat to boots.

The electronic express keeps rolling, and there’s no turning back. “Refuse it,” the prescient advice Sven Birkerts offered to conclude his Luddite manifesto The Gutenberg Elegies (1994), now sounds as dated and wistful as the fight song of a team that lost 70-0. Yet those of us doomed to anachronism by technology enjoy many compensations, not the least of them the knowledge that we’re gravely underestimated by cyber-sophisticates who pity us. The rumor that I’ve been reduced to tears and profanity by the multiple remotes that control hotel TV sets is not entirely unfounded. Mr. Wizard I am not. But the myth that computers and allied appliances are simply too complicated for tired old minds is based on the youthful assumption that everyone wants to learn this stuff, that anyone would if he could. The truth is that nearly anyone can if he has to. When circumstances force us to operate these impudent twenty-first-century machines, we fossils engage reluctantly but rarely fail. Survival-level computer skills are no neuroscience, nor were they meant to be. A motivated chimpanzee can do this, or even Randy Michaels.

It’s not so hard to teach an old dog new tricks—not if his dinner depends on it. But the more tricks he has to turn to fill his bowl, the more he’s going to hate you. And you can double the hatred—I speak for an incorrigible minority of old dogs—if he thinks most of your new tricks are stupid. The vices and virtues of the all-wired world merit ferocious cultural debate, but it’s hard to sustain a dialogue with young people who’ve never lived without the glowing screens, the blinking lights, the voices from the ether. Terms of discourse have changed as rapidly as hardware and software, and created the most prodigious generation gap in history. Rising generations—as opposed to sinking ones like mine—can’t be expected to grasp how quickly it happened, how it never evolved but exploded. Children born in Nagasaki after 1945 might have the same difficulty forming a clear picture of their city before the bomb. We, the obsolete, are obliged to argue from general principles. The purpose of technology is to make it easier to perform the essential tasks of our lives, tasks that include survival. But what if, instead or in addition, it merely creates tasks and problems for which there was no need, for which no human relevance can be logically demonstrated? What if it merely multiplies entities unnecessarily, in defiance of Occam’s razor? How many apps does it take to screw in a light bulb?

Nielsen reports that the average teenage girl in America sends more than 4,000 text messages a month, eight messages for every hour she’s awake. By an admittedly rough calculation that’s more social messages in thirty days than I’ve sent (through any and all media) in my entire life, now approaching two-thirds of a century. To me this statistic is as weird as a rumor that these girls roast and eat their pets. Apparently technology has activated some latent psychological, perhaps even genetic tendency toward incontinent interconnection, but I can’t imagine what we’ve gained from it. There’s no question that much has been lost—first and worst of all our privacy.

Privacy is the Great Divide. For the civilized—now the obsolete—it’s a primary article of faith that the tougher, the more impermeable the critical membrane between public and private, the more civilization flourishes. In the past, this has been one of the few important points of agreement between serious liberals and serious conservatives, and certainly among all the founders and architects of the Republic. Without privacy there’s no dignity, and without dignity “freedom” has no meaning. Thirty years ago, you could search in vain for an American who thought privacy was expendable. Yet recently one of the billionaires responsible for the rapid erosion of American privacy—some entrepreneur of the PC, cell phone or social network industries, I can’t recall—was asked how we could protect our privacy and replied dismissively, “Get over it.” “Get over it,” the bastard said, instantly converting my distaste and distrust to fear and loathing.

This is the grim place to which the heedless have marched us. The resistance—scattered, aging—will never produce a twenty-first century Patrick Henry to cry “Give me privacy or give me death.” But the death of privacy, like Goya’s Sleep of Reason, is breeding monsters. What do we make of the handsome Indian-American freshman at Rutgers, with an angelic smile and no history of antisocial behavior, who filmed and then webcast his roommate in a homosexual embrace? He never expected his victim to jump off the George Washington Bridge, but what did he expect? We’ll never know what this cherubic Iago was thinking, but just as hard to understand is that webcam/computer installations are standard equipment in freshman dorm rooms. When, why did all this electronic garbage become a generational norm?

A federal court convicted Ashton Lundeby, a seventeen-year-old North Carolina high school student, of masterminding an elaborate internet scheme that involved false reports of bombs planted at universities, high schools and FBI offices. Lundeby and his co-conspirators would call in a bomb threat to the local police, record each emergency response with surveillance cameras and then sell the footage to paying customers online. The Lundeby scheme required so much chutzpah—or criminal innocence—and so much expertise that we shake our heads in awe. But the confluence of electronic wizardry and commercial initiative is bound to remind us of billionaire Facebook entrepreneur Mark Zuckerberg and the bio-movie The Social Network that skewered him as a tormented sociopath. I never saw Avatar, but Zuckerberg and many of the other creatures portrayed in The Social Network were far more alien to me than blue people with tails in 3D. If they are the future, please help me find the door to the past. Hand me my crwth and my bow. Put the meat-pie hat with the ribbon on my head, and point me toward a market town where someone might spare a shilling for a tune.

Criticism from the sidelines, from the happily obsolete, has been ruled inadmissible. If you don’t play, who cares what you think about the players? Most of the recent books blaming psychic trauma on cyber-overload, like Jaron Lanier’s You Are Not a Gadget, have been written by Silicon Valley apostates with second thoughts about the digital revolution. But I don’t think any of us from pre-microchip generations have the right to shrug and look away. We—some of us—invented, marketed, and served them this bewildering array of gadgets. And there’s ample evidence that something ragged and unclean, something morally unsettling is loose among young people who could be your children and grandchildren, or mine.

Monsters whose obsessions result in crimes and lawsuits aren’t isolated cases, unfortunately. They’re nurtured in a flourishing internet subculture of bullies, creeps, and clowns. The New York Times ran a story about teenagers in an upscale suburb who were using cell phone cameras to record fights, savage beatings and violent stunts that compete for attention on websites like MySpace and YouTube. Many of the quotes in this story by Corey Kilgannon were chilling, jaw-dropping messages from a micro-culture gone mad. “Kids beat up other kids and tape it, just so other kids will see it and laugh,” shrugged one seventeen-year-old boy. “Or they just post stupid things they did online so other kids will look at their Web page.” His friend added, “Teens always do crazy stuff, but it’s just that much more intense and fun when you can post it. When you live in a boring town, what else is there to do?” And another explained, “Kids put their fights online for street cred.”

No less disturbing were the expert analyses the Times reporter solicited. “A lot of teens have this idea that life is a game and it’s all just entertainment,” said Nancy E. Willard, who wrote a book on cyberbullying. “In doing this, they’re jostling for social position and status, or establishing themselves in a certain social group, or just attracting attention. To them, this is defining who they are and what people think of them. The idea that ‘people know my name’ is an affirmation of who they are.” A role model for many of these suburban exhibitionists was a local man whose videos of himself hurling his body through neighborhood fences, an internet sensation, spawned a national fad. “A week ago, no one knew who I was—now my name has been on every news and talk show,” said this idiot, Adam Schleichkorn, now twenty-five. “I don’t care that it’s for something stupid. I was on Fox News cracking jokes. Maury Povich called me today. So I’m known as the fence-plowing kid. At least I’m known.”

You see what I mean by divergent terms of discourse. What does a grandfather on Medicare say to the fence-plowing kid, besides “Jesus Christ, kid”? Mark Zuckerberg, one of the world’s youngest billionaires, would be an equally tough lunch date. The infantilization of American culture seems to be an established fact—to judge only from Hollywood films, which in the thirty years since I was a film critic have changed their demographic from predominantly adult to predominantly preschool. Another established fact is the link between the sedentary online life and an epidemic of obesity. But there’s a chicken/egg problem with severe psychological displacement. Were Americans already evolving into strange life forms that require more and better electronic toys to mirror and exhibit themselves, or should we hold the toys responsible for their transformation?

Only the most embarrassing old-timers claim that things were better, or that we were better, way back when. To the best of my recall, teenagers of the ’50s and ’60s were just as cruel and status-conscious and no less obsessed with sex, though we knew a great deal less about it. Adolescent lives weren’t better, perhaps, but they sure as hell were different, and the difference appears to be all about context, about expanding identity groups. We lived our lives for a limited audience of parents, siblings, teachers, classmates and neighbors. It wasn’t a very attentive or demanding audience, but only one teenager in a thousand—a great athlete, a beauty queen, a musical prodigy—ever imagined a wider, even a national audience. The rest of us accepted our limitations. Academic achievements might lead to opportunities for a richer, more comfortable life later on; catching a touchdown pass or hitting a home run could be converted into immediate status and sometimes even sexual currency, coveted and hard to come by. (Forgive the male point of view—to boys like us in those days, females were another country.)

Hardwired teens of the internet era see all the world as their stage. And sometimes it is. The common dream of online performers is to “go viral” like Adam Schleichkorn, whose original fence-busting video attracted 70,000 viewers. The most hardcore juvenile delinquent of the ’50s would have been petrified at the thought of all those eyes. But now we discover a subculture where the natural need for privacy has been reversed, somehow, into a neurotic need for constant attention—attention of any kind at all. In this alternate universe it’s better to be disgusting, to be a figure of fun or an object of contempt, than to be invisible. Voyeurs demand, exhibitionists deliver, then they switch chairs. Vicariousness is all.

When did people begin to think of themselves as public offerings, as products they’re obliged to market and sell from cradle to grave? A typical teenager of fifty or even twenty-five years ago was alone in his locked room sulking, possibly even reading—possibly even reading something obscene. The typical (?) teenager in 2011 seems to be out filming himself to program his Facebook page or compete for attention on YouTube. This reversal is so radical, we could debate whether “personal technology,” in little more than a decade, has altered America’s DNA. But the eradication of privacy as a core human value doesn’t account for the cruelty, for the internet’s “culture of sadism” that the computer scientist Jaron Lanier decries in his book. It doesn’t account for the Rutgers atrocity or subhuman video “pranks” like attacking homeless men and beating a thirteen-year-old girl—both popular attractions on websites that encourage this spreading infection.

Is it possible that expanding the community immeasurably, from the dozens to the millions, dilutes all the positive aspects of community—compassion, loyalty, mutual support and responsibility? That as the community expands, communality contracts until dog-eat-dog rules again? With the whole anonymous world watching, at least in your imagination, do individuals lose substance, become little more than “viewers”—totaled online as “hits”—and Facebook “friends” you’ll never meet? You wouldn’t beat or betray a friend, but a “friend” is another matter. This is just a suggestion from the sidelines. No doubt the fence-plowing kid has a different explanation. The blessing conferred on those whom obsolescence has claimed is that we’ll never have to compete with “the kid” for the world’s attention. If you will, if you do, God help you.

Of course it’s a small minority of teenagers who beat up the homeless for publicity and post nude pictures of themselves online. But I’m afraid the majority is too marinated in the culture that breeds this behavior to see how grotesque and pathetic it’s become. It takes perspective. If you’re still able to take one step off the grid and look back, the view might shake you up, or crack you up. The dividing line between generations is laid down sharply in The Social Network when Sean Parker, loathsome co-founder of Napster, cries out in his cups, “People used to live on farms and in cities, now we’ll all live on the internet!” If Parker’s outburst chills you as a vision of an ultimate dystopia more depressing than 1984 or Brave New World—as the filmmakers intended—you’re on my side of the line. If you barely notice it, you’re lost on the far side.

Elsewhere the line is not so cleanly drawn. I confess that I laugh at people who buy and prize the Dick Tracy cell phones that do everything but brush your teeth and walk your dog. I also concede that I have extremely intelligent, serious, otherwise discriminating friends who poke away at the silly things with apparent fascination. These converts are not and will not be obsolete, perhaps as a matter of pride? But without dinosaurs like me to remind them, they’d forget what the old order was like. Have I disappointed them? What did I ever do to feed a suspicion I could be seduced by gadgets that invite me to gape at movies, stock reports, spreadsheets, pornography, weather maps and email on a screen half the size of a playing card, like some gorilla mesmerized by a Christmas tree ornament? Which of us is weird?

The last question can’t be answered definitively. In the few years since it opened, the high-tech superhighway—ten lanes, no speed limit—has carried most Americans far from their origins, and lured nearly all of us out of our comfort zones. But the highway is littered with accidents, too, and road signs that lead us nowhere. It’s mainly an aesthetic decision, finally, to pull off the road and turn off the ignition. It’s possible to remain physically and mentally vigorous thirty years after most of your contemporaries have faded away. But I believe that each of us has a kind of cultural expiration date, and there’s nothing more pitiful than a person who’s exhausted his cultural shelf life and doesn’t know it. Think of the sexagenarian who claims to love Eminem. How do you know when you’ve expired? Maybe when popular culture pushes you beyond contempt into physical nausea. I reached that place a while ago. If you’re old enough to remember Jerry Ford and you’re not there yet, you will be soon. Unless, of course, you’re one of the shell people with no core of sensibility or belief, ever ready to hitch a ride on anything that comes your way.

This isn’t a popular song I’m playing, or one you’re likely to hear again. But it isn’t a dirge. Or a swan song. Obsolete and unashamed, we feel like farmers who got their crops in safely before the hard rains fell. As storm clouds gather, it’s a time to walk the fields of stubble with the old dog, the hearthdog, and see, as the Bible says, that it is good. Then comes a long winter musing by the fire—though spring remains a question mark. The past is a flickering daydream, the present is turning ugly, and the future belongs to no one, though the kind of heirs we might have chosen seem less and less likely to inherit. We haven’t said “goodbye,” we’ve just said “enough.”

In common usage the word “obsolete” has become too pejorative. Many outdated things—Jim Crow, the Vatican, cigarettes, the two-party political system—wear out their welcome but fail to achieve extinction rapidly enough. Yet so much that’s obsolete deserved a better fate. We may be the last of our kind, but we flatter ourselves that we’ll be missed, perhaps even heeded more in the future than we were in the past. In my research I came across a lovely sentence in praise of the homely, ancient instrument that gave its name to my father’s family. With a minor lapse of modesty, I can pretend that the author is speaking of me: “For all of its (his) technical limitations, the crwth has great charm, and is much more than a historical curiosity.”

