In 2013, the New York Times published an article profiling a local sub-elite marathoner named Greg Cass, who was balancing his job as an investment banker with the personal quest of running a sub-2:30 marathon. At one point, the story made a comparison between sub-elite racers like Cass—those who finish in the top percentile but don’t race professionally—and the heroes of the amateur era of running in the United States.
During the 1970s, even the best runners in the country—like four-time Boston Marathon winner Bill Rodgers and Olympic marathon champ Frank Shorter—held day jobs to supplement their training. While Cass’s athletic aspirations were more modest than those of Rodgers or Shorter, Times reporter Jeré Longman suggested a shared “seriousness of purpose” in their respective approaches. The goal wasn’t simply to take part in a race—it was to run as fast as they could. It’s an attitude, Longman implied, that no longer reflects the current zeitgeist:
“This [amateur era] was an era unlike today when marathon fields have swelled to tens of thousands of runners and participation has become more important than competition for all but a relative few. By 2011, the average finishing time at the New York City Marathon was a pedestrian 4:28:20 for men and 4:44:35 for women.”
On the one hand, those pedestrian finishing times are an inevitable consequence of the sport’s growth. There are almost four times as many marathon runners in the United States today as there were in 1980, according to an industry survey from Running USA. The same survey also reveals that there are now 25 percent more runners over the age of 40.
But the Times article also hinted at something more counterintuitive considering the rise in participation over the past three decades. Despite several hundred thousand more marathoners, fewer of them are really good.
“My 2:38 has become a lot faster over the years,” George Hirsch, chairman of the board of the New York Road Runners, told the Times, referring to the PR he set at the 1979 Boston Marathon. Back in the day, Hirsch said, his time was respectable but nothing special. Plenty of people ran a sub-2:40 marathon. That no longer appears to be the case.
“At the professional end, American distance running is at the best place that it’s ever been,” says Ryan Lamppa, a former independent contractor for USA Track & Field, past media director for Running USA, and onetime self-described “starving-artist runner” who moved to California in 1985 with dreams of making the U.S. Olympic team. “But when I first moved to Santa Barbara in the mid-’80s, our local races were much more competitive,” Lamppa says. “From my memory, back in the mid-’80s, if you won a local 10K of note, you probably had to run close to 30 minutes or faster. Now it’s like 33 to 34 minutes.”
If his name sounds familiar in this context, it’s because Lamppa has addressed the issue before. In a 2013 Wall Street Journal article titled “The Slowest Generation,” he lamented that there are fewer fast runners in their 20s and 30s now than when he was in his prime—a sign that this younger generation is perhaps lacking competitive vigor. Millennials, unsurprisingly, were not thrilled with this assessment; some of them even dismissed Lamppa as one more curmudgeonly baby boomer.
Since recreational opportunities have proliferated over the past few decades, attempting to compare the relative competitiveness of succeeding generations might be a fool’s errand. (Alas, we’ll never know how many potential Michael Jordans or Frank Shorters succumbed to the siren song of yoga or glow-in-the-dark dodgeball.) That said, if we restrict ourselves to looking at the progression of finishing times in the two most high-profile American marathons—New York and Boston—there is some evidence to support Hirsch and Lamppa.
In major races like Boston or New York, the 100th-place finisher is comfortably in the 90-somethingth percentile of all marathon runners. This was the case even in 1977, when both races already had a few thousand participants. Hence, 100th place is a sensible barometer to gauge the level of the serious amateur.
Scanning the times from 1977 to 2015, one is struck by how performances appear to have declined since peaking in the early ’80s. Between 1978 and 1984, the 100th-place runner finished under 2:30 no fewer than 11 times. That has happened only once since 1988. Conversely, since the year 2002, there have been eight instances where the 100th-place runner was slower than 2:40, which had not happened a single time in the previous 25 years.
This is all the more surprising when one considers that the East African dominance in road running didn’t really begin until the mid-to-late 1990s. When several of the top ten spots are occupied by Kenyans and Ethiopians, as has consistently been the case in recent years, it seems reasonable to assume that other fast runners would be displaced, resulting in quicker times farther down the results. That hasn’t happened.
“What you’ve identified is real, but I don’t know what the consequences of it are,” says Amby Burfoot, winner of the 1968 Boston Marathon and former editor of Runner’s World, who points out that slower times haven’t resulted in a drop-off in overall race participation or the disappearance of U.S. runners on the world and Olympic stage. Still, Burfoot says local heroes have lost a step over the years.
“The fast sub-elites, or what we might call the regional champion, those people are definitely slower,” Burfoot says. “We see that in our local road races. There are races where I live in southeastern Connecticut, where, by and large, the times are slower than they were 30 or 40 years ago.”
The question is: Why? Burfoot has theories but says there’s no authoritative explanation. He says that Boston and New York used to be the races to run if you were any good, whereas fast runners now have many more options. At first glance, that may sound convincing, but it’s not as if Boston and New York have lost any of their prestige; both races have seen participation increase by tens of thousands since 1980.
Burfoot proposes another possible reason for the decline of the American amateur, one premised on changing incentives and running habits. In recent years, a rise in general health awareness has been accompanied by numerous studies suggesting that one gains similar benefits whether running ten miles a week or 70. Some studies even show that ten may be preferable. In the 1970s, Burfoot says, it felt like all the runners he knew, even those lacking any natural ability or competitive aspiration, were running 70 miles a week or more.
As I’ve argued before, running for one’s general health and running for competition are not the same thing. To become a respectable sub-elite marathoner, you have to dedicate an enormous amount of time and effort to an activity that will never have any financial payoff. “There’s no reward to being a top regional runner,” says Burfoot, who never made a dime from his road-racing career, despite winning the 1968 Boston Marathon in 2:22:17. “It’s a nice ego stroke to win a local 10K, but there’s literally no money in it, and we live in a world that is ever more consumed by money and the need to make it, or find it, or whatever.”
We also live in a world ever more consumed by social media and the attendant need to broadcast one’s making it or finding it to others. But Instagram might ultimately be a medium ill-suited to communicate the absurd (and highly personal) sense of gratification one might gain from running just the slightest bit faster. In other words: amateur runners have to want to be fast purely for their own satisfaction.
Greg Cass, who is still looking to get past the 2:30 barrier, agrees. “The thought of going under 2:30 is in many ways only applicable and interesting to me,” he says. “To the broader public, there’s not much difference between a 2:58 and a 2:28.”