
The Profound Limitations of Parental Agency

Long before I had kids, I recall my parents making the case that all of their children had pretty much formed their basic, enduring personalities by the age of two. They said as much more than once, invariably in the act of throwing up their hands in exasperated resignation, for much as they tried to shape their children’s characters further or cajole them into this/that behavior (and trust me: they did this a great deal), fundamental personalities almost always prevailed. On account of this experience, by the time their third kid (my younger brother) had reached high school, my parents had become markedly laissez-faire in dozens of ways that frankly annoyed my older sister and me. “We never got away with that,” we’d grouse to each other.

Well, as has been the case in myriad respects, my parents were right. My oldest graduates from college this month and while I naturally believe him to be a lovely, capable kid in most every way (ditto for his younger sister), these young adults are each remarkably similar — in terms of sociability, focus, ambition, daring and temperament — to the two-year-old kids they were. Yeah, they’ve grown or excelled or lagged or flagged in these and various other respects. And I don’t believe anyone can or should stop parenting (I don’t think that’s possible). But there seems to me a remarkable, observable consistency of character that is more or less resistant to “parenting”.

I’m always amused when I come across yet another parenting book reviewed in The New Yorker or New York Times. I muse at the publishing industry’s having identified and exploited this incredibly willing (read: anxious) audience. Then I laugh outright, at myself, because I nearly always read them, too (the reviews anyway).

The irony is, as parents, we have an agency that simply isn’t so strong as we want to believe. Even if we accept our limited impact, on some level, this desire for agency tends to seep into other areas we believe we can control: etiquette, dress and manners; identification and pursuit of extra-curricular “passions”; geography (e.g., buying houses in towns with “good” school systems); self-esteem (e.g., “premier” soccer and other invariably commercial gambits); the entire SAT prep and college admission culture… There’s no harm in trying all this stuff, in doing one’s best. But it’s really a hit-or-miss affair, I’ve come to believe. Ultimately, 9 times out of 10, it’s down to the kid and his/her fundamental self.

This is not parental fatalism. It is an attempt to recognize (with serenity) the agency one has; to accept (without prejudice) those situations that are beyond one’s control; and (in a perfect world) to capably distinguish one from the other. I held onto this quote from Adam Gopnik’s January 2018 review of yet another parenting book, in the NYer:

As satirists have pointed out for millennia, civilized behavior is artificial and ridiculous: It means pretending to be glad to see people you aren’t glad to see, praising parties you wished you hadn’t gone to, thanking friends for presents you wish you hadn’t received. Training kids to feign passion is the art of parenting. The passions they really have belong only to them.

Surely environment matters. Even then, however, it’s not the environment parents provide that seems to matter most — or so writes Judith Rich Harris in the best book I ever did read on this subject, The Nurture Assumption (see here the book review that led to my buying/reading the book). In short, Harris argues that we assume our kids turn out the way they do according to a pretty even split between nature (genetic inheritance) and nurture (environment). “The use of ‘nurture’ as a synonym for ‘environment’,” Harris explains, “is based on the assumption that what influences children’s development, apart from their genes, is the way their parents bring them up.”

If this were true, siblings — who are as genetically similar as any humans can be (save identical twins); who are traditionally raised in the same household by the same parents — would all have very similar personalities. Anyone raised in a family of two or more children understands just how ridiculous that idea is.

Ever wondered why the children of recent immigrants don’t speak with accents, even though their heavily accented parents do? Or why the children of deaf/mute parents learn to speak at all? Put simply, Harris argues that a child’s peer group accounts for far more environmental influence in the long run — influence that, since Freud, had traditionally and unduly been attributed to parents. If we’re honest with ourselves, as parents, we’d admit that our children generally do put a lot more stock in the opinions, social mores and examples of their peers. To an extent, parents can help determine or control a child’s peer groups but that is the environment that matters (and the variability of peer groups helps explain why siblings turn out so very differently).

Of course, children do pick up quite a lot from their parents — most of it genetic. This is the other nurture assumption: that we pick up traits and habits and behaviors by copying our parents. Harris argues, persuasively, that we humans don’t do this nearly so often as is commonly accepted; most of those things we attribute to parental modeling are in fact inherited from parents genetically, not environmentally.

I’m on board with this idea of behavioral genetics, too. Growing up, there were dozens of things that my mom and dad did that drove me absolutely crazy — and yet today, at 53, I find myself doing many of these same things. I didn’t “model” my behavior on them in these cases. Far from it. Still, I couldn’t resist these behaviors because they are genetically baked right in.

Which brings me back to my brother, and how my sister and I felt he got a sweet deal — coming third and last, by which time, my parents had given in to the power of personality (and behavioral genetics, though they wouldn’t have put it that way). Invariably, she and I would inveigh against this new libertine parental stance of theirs, or make some wise-ass comment in place of outright carping. At which point my mother or father would issue another pearl of wisdom, one we’d heard before, one that has nothing to do with nature or nurture but still rings true: We’ve never tried to treat everyone exactly the same around here. Everyone gets what they need.

Larry Sanders: I Never Knew Ye


I’ve never subscribed to HBO. There may have been a month here and there when it was provided to us here in New Gloucester, by mistake, or as part of some promotion, but when the cable monolith inevitably attempted to charge us, we balked. The movie-watching we missed as a result of this cultural diminishment we never counted as much of a loss.

However, many is the time I wish I had actually seen all those episodes of Curb Your Enthusiasm and The Larry Sanders Show.

Last year, from some Bangkok street vendor, I procured the first four seasons of Curb, for a ridiculously small sum. It was good. I had seen the odd show here and there. But ultimately I had trouble watching them en masse, to be frank. After 5-6 episodes, not even a full season, I found myself worn out by the sameness of each plot: No, Larry. No, don’t do that. Oh geez…

IFC started rebroadcasting The Larry Sanders Show in January and with a deft flick of my DVR settings, I have proceeded to record each episode, in order, from the very beginning of the show’s run in 1992. It’s hard to keep up. My family rolls its eyes when they glimpse the list of recorded shows and spy this sea of Larry.

I’ll temper my enthusiasm by saying the first two seasons of Larry Sanders were only slightly better than average — and something of a letdown when contrasted with the glowing tributes this series routinely garners from TV cognoscenti. These episodes didn’t suffer from a sameness, a la Curb, but I did find myself wondering why I was supposed to care about any of the main characters, who are unfailingly funny but shitty.

Well, I can report that in Season 4 the show officially hits its stride. It’s not just easy for me to sit down and watch 2-3 episodes in a sitting; I make time for it. Indeed, I recently watched the fictitious talk show’s 8th anniversary special, and it struck me that a number of things have come together, revealing the show’s genius and explaining all the accolades I’d read and listened to over the years.

Read More

Forward, March! Dirt Driveway is Lone Beneficiary of Late Spring

As a Masshole, I have not earned (and will never earn) the right to publicly complain about winter weather here in Vacationland, lest I be called out by some actual Mainer as “a flatlander” who doesn’t “know what winter is”. Truth be told (and chastisers be damned), very little distinguishes southern Maine winters from those in Greater Boston. March is the exception. It is traditionally the most difficult month for my flatlander/Michigander wife and me. Down in Boston (and out in Kalamazoo), there might be a late-winter storm or two but signs of spring abound in March: the inevitable melt, up-creeping temperatures, budding trees… Here in New Gloucester, we don’t see those things until April, and with each passing year that proves a harder pill to swallow.

There is one advantage to this annual winter extension, however: The generous slather of ice and snow keeps our 600-yard dirt driveway smooth and comely. Indeed, it never drives so well as during the months of January, February and March. It’s supposed to snow another foot tonight (March 12), meaning we can expect to enjoy burnished, aesthetically pleasing driveway conditions throughout the month. When we thank heaven around here, this is what passes for a small favor.

Reared in the suburbs, I knew nothing of dirt driveways and their upkeep prior to our landing here in the spring of 1998. Like any new homeowner, I learned these ropes on the job.

Read More

Awfully Fond (and Proud): Sesame Street’s Founding Generation

I have a distinct memory (among my very earliest) of my mother describing a new TV show that was about to air on PBS. “It’s for kids exactly your age,” she told me, and so it was. Sesame Street debuted in late 1969, when I was 5. In a home where screen time was highly restricted (our Sony Trinitron representing the only screen), Grover, Ernie, Bert, Maria, Mr. Hooper, Kermit, Gordon, Guy Smiley & Co. proved a staple of my early cultural sentience. It occurred to me recently that without the enthusiastic approval of kids my age — this founding Sesame Street cohort — the show might not have survived or become such a thing. And what a thing: 48 years and counting.

While channel surfing through the upper, premium reaches of my cable guide, I never seem to happen upon Sesame Street. Yes, today the show airs on HBO. You may have read about this arrangement whereby first-run episodes can be found there on Saturday mornings; eventually, they cycle back onto PBS in a post-modern form of syndication. I never see it there either, to be honest (my viewing habits are too nocturnal). It made this transition 2 years ago and I gather the show continues to wear extremely well.

Buoyed by the idea that this hugely influential, 50-year-old show retains “the brassy splendor of The Bugs Bunny Show and the institutional dignity of a secular Sabbath school,” I’ve been conducting an experiment these last few weeks: I’ve been mentioning Sesame Street to folks generally my age and paying attention to their mood in reaction. If it generally brightens, I know they are fellow members of this my cohort… If I make a Cookie Monster or Roosevelt Franklin reference to someone just 4 years older, however, the reactions differ quite markedly. Often they don’t get it, or they will roll their eyes and make it clear they didn’t really watch Sesame Street. This makes sense: When the show debuted, these elder folks had already aged out.

Read More

Palestra Tales, 40 Years in the Making

 

PHILADELPHIA — When we learned my daughter Clara would matriculate at the University of Pennsylvania, naturally her dad was thrilled: Here was my chance to make a proper pilgrimage to The Palestra, the most storied college basketball venue of the 20th Century.

As I’ve written here before, while my hoops allegiance today favors the overtly professional NBA, there was a two-decade period starting in the mid-1970s (just as John Wooden’s run at UCLA came to an end) when I was a far more fervent college basketball junkie. The Palestra was central to that emerging fandom, which just happened to coincide with the sport’s surge into the national sporting consciousness.

College basketball and the NCAA Tournament are so popular today, so ubiquitous on television, it’s easy to forget their dual ascension is relatively recent. For all intents and purposes, UCLA and its 10 NCAA titles from 1964-75 effectively stunted the sport’s broader popularity (when certain teams/programs utterly dominate an underexposed sport, big cultural awareness only comes when some ridiculous win streak is snapped; think UConn, whose dominance has stunted women’s college basketball in the same way). Men’s college basketball should have taken off in the 1960s, but it didn’t because the only time anyone paid attention was when UCLA got beaten: first by Houston (1968’s famous Astrodome game), then by Notre Dame in 1971. These losses proved to be mere blips; the Bruins eventually won national titles both years. But someone finally did beat them when it counted (NC State, in the 1974 national semifinal). Then Wooden retired with one last title, in 1975. Suddenly the field was open and seeded. Take it from someone who was there: The idea that some team other than UCLA could win it all each year was novel and beguiling (!) — only then did the sport truly take off.

The Palestra (bottom right) sits directly beside historic Franklin Field, home of the Penn Relays and where Santa got booed in 1968. It also hosted the Philadelphia Eagles’ last NFL championship (1960). We visited Feb. 3, 2018, one day before the Eagles did it again.

Growing up in New England at this time, we’d already had our interest piqued by a Providence College team led by Ernie D, Kevin Stacom and Marvin Barnes. The Friars went all the way to the Final Four in 1973 — that year WJAR Channel 10 out of Providence started televising a bunch of PC games. The following year, rival WPRI Channel 12 took the talented University of Rhode Island teams (led by Sly Williams) under its broadcasting wing. Even obscure UHF stations like Channel 27 out of Worcester aired weekly games (each of them called by Bob Fouracre and his magnificent toupée) featuring Holy Cross mainly but also Boston College — even tiny Assumption College, led by the immortal Billy Worm (look him up; he was a stud).

Soon the national networks and their affiliates in Boston got wise and started televising big regional games every Saturday afternoon. Here is where I got to know The Palestra. Hoop-rich Philadelphia was home to The Big 5, a city series featuring local rivals Villanova, Penn, St. Joseph’s, Temple and LaSalle. Every Big 5 game was played at The Palestra and these were the games I watched with manic intensity each weekend. These were the memories dislodged to glorious effect earlier this month, when Clara, Sharon, Philly-born erstwhile golf freak Mike Sweeney and I watched the Quakers beat Yale, 58-50.

When the 10,000-seat Palestra opened in 1927, it was among the largest indoor sporting venues on Earth (the name is derived from the ancient Greek term palæstra, a rectangular space attached to a training facility, or gymnasium, where athletes would compete in public, before an audience). Today it’s a bandbox but still all I could have hoped for: seating stacked steeply with front rows right on the baselines/endlines; vaulted ceilings filled with banners; exposed brick everywhere — pretty much exactly as I remember it from the mid to late ‘70s.

But there was more to our Feb. 3 visit. Quite a bit more.

Read More

Like carrying ‘a Rolls Royce with buckskin seats,’ only lighter…

Late January in the golf realm is traditionally dominated by the PGA Merchandise Show in Orlando. Even if one doesn’t attend (as I did not), industry types and golfers alike are invariably bombarded this time of year by attendant product news, hailing the latest and greatest from all corners of golfdom. I received this morning a press release re. the vaunted Mackenzie Walker. I no longer “carry”, as they say; the ol’ L4/L5 and S1/S2 discs won’t allow it. But I did report on this specific subject once upon a time, for the dearly departed Golf Connoisseur. Glad to see the company (if not the magazine) is still in business.

Considering all our outward reverence for tradition and history, today’s golfers would appear to have very few practical retro options. Yes, we can walk, take a caddie, wear a Hogan cap or perhaps re-attach to our shoes those god-awful kilties. But we don’t see modern players making any truly meaningful throwback gestures, such as forsaking the Pro V1 for a Haskell — or even an Acushnet Club Special. We don’t see them trading micro-fiber for tweed. Yes, Old Tom Morris reportedly made one helluva niblick but the market for one, today, is limited to collectors and hickory-wielding re-enactors.

This is precisely the beauty of the Mackenzie Walker, the all-leather carry bag that was first introduced in the 1980s, fell into obscurity amid a hail of ownership failures but has re-emerged under the aegis of Oregon-based professional Todd Rohrer. It’s a niche market, to be sure, but the sumptuous, hand-sewn Mackenzie bag (which, when slung across your shoulder, feels like a comfortably worn club chair, only not nearly so cumbersome) is beginning to gain traction at some of America’s finest clubs — perhaps as a statement of principle in an ever more titanium-reinforced world.

“Technology makes the game a little more enjoyable, but so does this,” Rohrer says, while gently stroking two new shipments of buttery leather, one in black, the other champagne. “The first bag I make out of this stuff is going to look like a Rolls Royce with buckskin seats.”

The first Mackenzie bag Rohrer ever saw was black. He was managing The Reserve Vineyards & Golf Club in Portland, Oregon; it was the late 1990s, during the Fred Meyer Challenge, “and Peter Jacobsen came walking across the practice green with the coolest black leather Sunday bag I’d ever seen. I was like, ‘Whoa…’ These bags evoke strong emotions. They just make people feel good.”

Jacobsen was an early backer of the Mackenzie phenomenon; indeed, he and his brother, Dave, named the product. Not for Alister, the architect, but for Rick MacKenzie, their caddie during a 1985 trip to Scotland (and now the caddie master at St. Andrews). That was one spelling corruption and several ownership groups ago. Rohrer is the new keeper of the flame (www.mackenziegolfbags.com) and he’s determined to “refine” the bag without messing with it.

“For example, the round ring here at the top of the bag. It used to be a piece of steel we got from Mexico, but through my sewing machine mechanic I found an experienced welder who just happens to sculpt in metal. Now the ring is hand-formed stainless steel and the weld on it is just about a work of art — and you’ll never even see it because we sew it into your bag!”

Ditto for the lighter, 50-gram composite fiber batten (replacing a 675-gram metal frame) that provides the Mackenzie Walker just enough structure, while maintaining its requisite Sunday-bag slouch.

Otherwise the Mackenzie bag remains gloriously low-tech, unchanged and unadorned. No double-helixed nylon straps. No insulated water-bottle receptacle. No special compartments for, well, anything really. They’ll hand-sew you some lovely barrel-style head covers but, outwardly, there will never be more to a Mackenzie Walker than a single strap, a couple pockets and impossibly soft leather.

Okay, a bag stand would be nice. Some day. Maybe.

“We’ve had that conversation,” Rohrer admits, a bit warily. “But if we ever do one, it will be the most damnably elegant bag stand you’ve ever seen.”

Long Story: Why Rugby’s Distant Cousin has Replaced Tackling with Hitting

What’s wrong with this picture? Stefon Diggs (14) scored a winning, last-second touchdown on Sunday because Marcus Williams (43) went for the hit, not the traditional tackle…

Having basked in every last detail of Sunday’s miraculous walk-off touchdown by Minnesota Vikings wide-out Stefon Diggs, let’s connect a few dots, for in so doing we link the NFL’s signature moment this season to the league’s most pressing issue.

Look at the picture that accompanies this essay and examine with me what New Orleans Saints safety Marcus Williams was thinking.

We should first take a moment to pity the man, a rookie whose coaches put him in a god-awful position — “on an island,” as they say, by himself defending half the field when the situation clearly called for the Mother of All Prevent Defenses. Even in this highly vulnerable position, however, all Williams needed to do was play deep center field, keep Mr. Diggs in front of him, eventually wrap him up and wait for help, or bring him down, ideally in the field of play (but even a shove out of bounds would have sufficed).

Instead, Williams did what most professional footballers tend to do in the 21st century: He went for the “spectacle hit”, head first.

Competitively, as we’ve seen, the results were disastrous. (Williams even managed to compound his misfortune, somewhat comically, by whiffing on Diggs entirely, then taking out his teammate — the only guy in a position to chase Diggs down.) But if we step back, we see here yet another consequence of football’s troubling evolution on the defensive side of scrimmage. Despite a litany of league-wide initiatives to curb headfirst tackling — the result of mounting evidence linking repeated, football-related head trauma to brain injury (chronic traumatic encephalopathy, or CTE) — the NFL’s hit culture remains firmly in place. Even in a situation like Sunday’s, where old fashioned, rugby-style tackling was called for, Williams acted on the instinct that football today engenders.

NFL Football in the here and now is plenty good fun, the most popular and culturally dominant game in 21st century North America. Minnesota’s unlikely victory on Sunday (indeed, three of the four games this past weekend) showcased exactly why this is so. NFL games can be spectacularly entertaining.

But it would be a stretch to consider the game of professional football “perfected”. In reality, any sport played at the elite level exists as a moving target, a work in evolutionary progress, because the salient factors affecting that evolution — rules, tactics, equipment, geography and fashion — also shift and evolve. All this transforms the way a game is played over the course of time, sometimes by design, sometimes organically without much guidance at all.

In 2017, we can add “culture” and “the legal process” to this list of salient change-agents. People took notice when former NFL player Ed Cunningham resigned from his position as an ESPN football analyst — on account of the game’s growing concussion dilemma — but, in truth, we’ve become somewhat inured to stories like this because nearly every week brings some new, relevant development, be it evidence that concussions sustained in pee wee football can lead to adult brain trauma, or steps the Canadian Football League has taken to reduce the volume of dangerous hits.

The idea that former Patriots tight end and convicted murderer Aaron Hernandez might have committed his crimes while experiencing advanced-stage CTE adds to this potent mix the elements of irony and the macabre. Did you know that a class-action lawsuit, brought on behalf of current and former NCAA student-athletes, remains pending before Judge John Z. Lee of the United States District Court for the Northern District of Illinois? Me neither. Class actions have their own online portals these days, naturally. Visit this one and be prepared for the following greeting: “Welcome to the NCAA Student-Athlete Concussion Injury Litigation Website.”

Bit by bit, the forces of change would appear to be gathering over football, as they have intermittently but more or less continuously for more than a century. No game, it seems to me, has evolved so far, so quickly or so dangerously.

Read More

System Error 23: Bad Disk or File Name

[See below a 1996 article from The Harold Herald, the world’s first blog, which I invented in the early 1990s. Yeah, you heard me right … The act of ‘composing at the keyboard’ is so ingrained today, one can forget when and why that started — and just how many technological eras our lives have spanned since. The newspaper that first employed me was still waxed and ‘pasted up’ on boards, with photos carved in with X-Acto knives…] 

As I prepare to discard the computer on which I truly learned to type, compose at the keyboard and play video games, I’ve come not to bury the ol’ ATT 6300 but to praise it. After doling out the praise, however, it’s headed for the scrap heap.

For 11 years, this IBM knock-off served various housemates and me extremely well under the most trying circumstances. I dare say, no unit still operating has endured more moves, beer-dousings and random acts of neglect than has our intrepid ATT 6300.

Harold Herald Virtual Editor Dave Rose was the original owner, having purchased the machine via a special Wesleyan University discount deal prior to our senior year. Today, its game graphics would pale by comparison to, say, those of any Fisher-Price product. But back in 1985, this baby was state of the art.

In the years preceding Dave’s monumental purchase, I had no PC experience whatsoever. Hardly anyone did. For the first two and a half years of college, for example, I would write papers long hand. It was imperative that I produce a finished draft two days in advance, leaving me an entire evening to hunt and peck the final product on my enormous, ’50s-era electric typewriter, which my dad found at the dump and refurbished. These “typing” sessions were trying times for my housemates and me: evenings laced with profanity born of frustration and pungent White-Out fumes as disorienting (in their own way) as Thai stick.

Behold, Digger: This would be Screen 3, I think. Back in the day, I progressed as far as Screen 12…

Late in my junior year I took to typing papers on the university’s mainframe computer, which was painfully slow and inconvenient, as it was located in the Science Library as opposed to our house.

All this changed senior year when Rose bought the computer, thereby opening up a whole new world to the residents of 8 Warren St.

The video games, crude though they were, proved the ATT 6300’s most enduring legacy. Sure, I wrote my thesis on this machine but, more important, I also shattered the world Digger record some 10 separate times! I am neither a talented nor a particularly ardent gamer, but I made myself the all-time Digger champion through relentless practice. This involved repeatedly drawing myself a draft beer (we were on tap 24 hours a day, 7 days a week my senior year), going upstairs to Dave’s room and “Digging” until something more important came along.

Digger was a sort of Pacman knock-off. Space Vades, a thinly disguised copyright infringement of Space Invaders, was another 8 Warren St. mainstay. There were innumerable Star Wars-inspired, fighter-jet “shooter” games, several of which made their marks as the next late-night obsession of Dr. Rose and perennial roommate Dennis Carboni.

Come to think of it, I associate much of the computer’s nocturnal use with Dennis, a.k.a. The Bone, That Bone, Bonish, El Carbon and (my personal favorite) You Goddamned Fuckin’ Bone.

That Bone was one of the world’s great procrastinators. He never started a paper until 3 a.m. the morning it was due. Invariably, I would get up for class, poke my head into the computer room and Dennis would smile back, his eyes bleary but illuminated by the monitor.

“How’s it coming, you goddamned Bone?”

“Oh, hey … No problem: 11 o’clock class.”

Obsessive nearly to a fault, Dennis and Dave would often become utterly engrossed in some new DOS-based computer game via the 6300 (in the same way they became engrossed in things like mail-order blow guns, palindromes or the album art of David Bowie). Invariably, they would pursue these new video-game obsessions late into the night. Rarely, however, would Rose outlast the Bone.

One night Rose and Bone secured some flight simulator software that enabled them to “fly” Piper Cubs, in real time, with functional control panels. After watching Rose navigate his way from Boston to New York City, I went to bed. It was interesting but quickly became tedious, as the screen went a dull, blank green when one left Greater Boston. Such primitive graphics cards didn’t show any topographical detail until one approached LaGuardia.

I saw Dennis the next morning and he looked like hell.

“Bone, you look like hell,” I said.

“Yeah, after you went to bed I flew to Salt Lake City!”

“How long did it take you?”

“Seven hours.”

Read More

HH Flashback: Misery Can Neither Be Created Nor Destroyed

[See here an archival excerpt from The Harold Herald, the world’s first blog, which I invented in the early 1990s. Yeah, I did… One of the things that made the HH special, and thereby transcend the as-yet-uncreated blog genre, was the fact that we attracted scads of talented contributors. Dave Rose was one of these, and here we reprint one of my favorite bits, first published circa 1995, when CO2 levels were still sorta quaint. But with the onset of winter here in Maine, and wildfires raging across Los Angeles County, it remains damned timely.]

By DR. DAVID ROSE

BOSTON, Mass. — From a meteorological perspective, this winter has been a particularly difficult one in New England. The ground here has been snow-covered for at least a month, and each time the snow begins to retreat a new storm sets in, dumping a foot or two of the white stuff on the city’s long-suffering populace.

In times like these, even the most stalwart, Eastern masochist can cast an admiring eye to the South or West, imagining more comfortable — if less character-building — Februarys. In weaker moments we are all capable of believing we would be less miserable if only the weather were better.

What few people realize, however, is that misery — like matter, energy or gravity — is a measurable entity subject to strict physical laws. Paramount among these is the law of conservation of misery, which states that misery can be neither created nor destroyed. What the law of conservation of misery means is that each human being is subject to a fixed quantity of misery during his or her lifetime. This “misery quotient” is absolutely immutable, a constant that holds across socioeconomic groups and geographic boundaries.

The law can be demonstrated in the field by measuring and tabulating misery in test subjects by using sensitive, electronic monitoring equipment. In the following study, diary entries for three individuals are followed by the amount of misery experienced by each, expressed in misery units (MU).

Subject 1, Los Angeles, Calif.

Day 1: Beautiful day. Saw Erik Estrada at Arby’s (.002 MU)

Day 2: Beautiful day. Discussed Rolfing with a Scientologist. (22.001 MU)

Day 3: Beautiful day. Around noon my house ripped loose from its foundation, slid down a hill, burst into flames and was swallowed up by a huge fissure that opened in the Earth. I was trapped for four weeks and was forced to drink my own urine to survive. One of the paramedics looked just like Kevin Bacon in Footloose. (1223.12 MU)

Subject 2, Tallahassee, Fla.

Day 1: Beautiful day. Stayed in the trailer and ran the air conditioner. (.003 MU)

Day 2: Beautiful day. Noticed that some, but by no means all, of my neighbors bear a striking resemblance to Gomer Pyle. (12.4 MU)

Day 3: The morning was beautiful, but in the afternoon I was mistaken for a German tourist and shot in the head, doused with gasoline, and set afire during a hurricane that destroyed the entire trailer park. (1232.72 MU)

Subject 3, Boston, Mass.

Day 1: Mixture of snow and sleet. Frostbite in right foot. (415.041 MU)

Day 2: Mixture of snow and freezing rain. My right foot has become gangrenous, and the stench is unbearable. (415.041 MU)

Day 3: More snow. However, I reflected today that my house remains intact and this gave me a sense of stability and well-being. Right foot amputated. (415.041 MU)

Note that the three subjects had very different experiences during the test period. However, the total amount of misery endured by each subject is identical (1245.123 MU).
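For the skeptics in the readership, the tabulation above can be checked in a few lines of Python (the MU figures are copied straight from the diary entries; the city labels are just that, labels):

```python
# Verify the "law of conservation of misery": each subject's
# three diary entries, in misery units (MU), should sum to the
# same total over the test period.
subjects = {
    "Los Angeles": [0.002, 22.001, 1223.12],
    "Tallahassee": [0.003, 12.4, 1232.72],
    "Boston": [415.041, 415.041, 415.041],
}

# Round to 3 decimals to sidestep floating-point crumbs.
totals = {city: round(sum(entries), 3) for city, entries in subjects.items()}
print(totals)  # every subject lands on 1245.123 MU
```

Sure enough, sun-blasted cataclysm and gangrenous drudgery tally to the decimal.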

While life in Boston is characterized by an endless series of petty humiliations and annoyances, life to the South or West consists of long stretches of inane, vapid, colorless contentment punctuated by absolute cataclysm. You can take your pick, but you can’t avoid misery altogether.

And before you move to warmer climes, consider the fact that spring will bring nicer weather to Boston, whereas Gomer Pyle lives in Tallahassee year ’round.

Herald Science Editor David Rose, PhD, is the world’s foremost authority on suffering. While he still gets a charge from the warranted misfortune of others, he specializes in chance trauma and self-imposed misery. He once dieted for two weeks on nothing but chicken bouillon and carrots. His latest book, “I’m Wretched, You’re Wretched” (Knopf, $14.95), was published in February.

That Night a Mouth Roared and a Light Went Out


Like many others that fateful night 37 years ago, Dec. 8, 1980, I learned of John Lennon’s death from Howard Cosell. Yeah, that Howard Cosell. It was Monday night, the Patriots were in Miami, and, in 1980, Howard was still presiding — in his inimitably pedantic, overly dramatic fashion — over Monday Night Football, what in the pre-cable era was the week’s premier sports broadcasting event. Howard was respectful of the news, as respectful as his bombastic persona would allow: He treated it as he would a punt returner who has broken clear of the pack with only the kicker to beat. See that bizarre media moment, preserved for all time, here. ESPN would later weigh in with a meta-media doc, here.

My dad and I always watched MNF and we were stunned, naturally. It was legitimately stunning news delivered by a most unlikely source, in a peculiar context. The Pats’ left-footed, English place kicker — John Smith (from Leafield, Oxfordshire) — was lining up a field goal attempt when Cosell abruptly altered the narrative. The only thing that would’ve made it more bizarre? If Smith had hailed from Blackburn, Lancashire.

We called my mother into the room. She was the founding and still chief Beatles lover in our family, and John was clearly her favorite. She was 41 in 1980, essentially the same age as John Lennon. She had latched onto them from the start; indeed, my dad had teased her for digging a band whose enthusiasts were, at that stage, mainly 13- and 14-year-old girls. But my mom possesses a keen musical sensibility, and her early support for their chops was more than justified in the years to come… She teared up listening to Cosell bloviate, then left the room.

Not sure why, but the holiday period tends to include a lot of Beatles content on PBS. Just last week I saw that Ron Howard’s “Eight Days a Week” was featured, along with something called “Sgt. Pepper’s Musical Revolution”, as part of a fundraiser. All these years later, the Beatles are considered subject matter for the whole family, apparently. If you should get the chance, make time this month to watch the superb documentary “LENNONYC”, about his post-Beatles years in New York City (I saw it on PBS, but today you can catch it online, here). It was an eventful decade that followed hard on the band’s break-up, in 1970. For Lennon it featured a gaggle of outsized characters and spanned a remarkable procession of music-making, protesting, drug-taking, deportation-resisting, legal wrangling, breaking up, getting back together, child-rearing and, ultimately, growing up. That was the message one took away at film’s close: Here was a guy who had finally shed the latent adolescence of rock stardom and become a man, in his own right, only to be killed by a psychopath at the exact moment that maturity was to be revealed (his gorgeous new album, “Double Fantasy”, was released on Nov. 17, 1980). I don’t know that it gets much sadder than that.