This month’s issue of Medicine & Science in Sports & Exercise features a debate on the merits of “hypoxic training”—that is, training in the thin air of real or simulated altitude for the purposes of enhancing endurance. On the surface, it’s a heavily mismatched debate: in the decades since the idea was popularized in the lead-up to the 1968 Mexico City Olympics, altitude training has become almost compulsory for elite endurance athletes, and it has been heavily studied by scientists around the world. There aren’t many doubters left.
Still, there are a few. At a conference in Amsterdam last month, I met Christoph Siebenmann, the Swiss researcher (currently at the Institute of Mountain Emergency Medicine in Bolzano, Italy) who co-wrote the skeptic’s take in MSSE. Hearing Siebenmann present his case in person and chatting with him over dinner helped me to understand where he and his co-author, University of Wisconsin researcher Jerome Dempsey, are coming from. Here are some of the points that stuck with me.
Altitude Training Does Work in Theory
This is a key concession: we’re not arguing about whether the idea makes sense, but about whether it works in practice. The theory relies on two basic assumptions: (1) spending enough time at altitude will trigger an increase in the amount of oxygen-carrying red blood cells in your body; and (2) increasing your red blood cell volume will enhance your endurance.
Both of these things are true. Siebenmann himself published a study in 2015 in which volunteers spent 28 days at 11,300 feet (3,454 meters). By the end of the stay, they had increased their red blood cell volume by an average of 99 milliliters.
It’s also clear that adding red blood cells boosts your performance. In a recent study of the microdosing approach to blood doping, a transfusion of 135 milliliters of red blood cells improved cycling time trial performance by about five percent. Connect these dots, and you have good reason to believe altitude training should work.
The Dose Makes the Magic
But there are a couple of problems with this math, Siebenmann says. First, actual altitude training usually takes place between 6,500 feet (2,000 meters) and, at most, 9,800 feet (3,000 meters), because of the risks of disrupted sleep, poor training, and altitude illness at greater elevations. At these lower altitudes, you'd expect a smaller boost in red blood cells than the 99 milliliters Siebenmann observed.
Also, the microdosing study involved subjects with an average VO2 max of around 60 ml/kg/min, which is typical of well-trained but not elite athletes. Earlier research has found that truly elite athletes get only about half as much benefit from blood doping as athletes with a VO2 max in this range. In fact, in a 1986 study of four elite runners with an average VO2 max of 77 ml/kg/min, adding 200 milliliters of red blood cells didn't do anything; adding 400 milliliters produced a small improvement in VO2 max; and adding 600 milliliters gave a bigger improvement.
So Siebenmann’s revised numbers suggest that if you take a realistic altitude training scenario of three or four weeks at a moderate altitude, you’ll get a boost of less than 99 ml of red blood cells, while an elite athlete would need a boost of more like 400 ml to see a measurable performance benefit.
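To make the gap concrete, here's a back-of-envelope sketch of that dose argument in Python. The 99 ml figure and the roughly 400 ml elite threshold come from the studies described above; the linear scaling of the red-blood-cell response with altitude above a ~1,000-meter floor is my own simplifying assumption, not anything Siebenmann has published.

```python
def expected_rbc_gain_ml(altitude_m, gain_at_3454m_ml=99.0):
    """Crudely scale the 99 ml gain observed after ~4 weeks at 3,454 m
    down to a lower training altitude. Assumes the erythropoietic
    response grows roughly linearly with altitude above ~1,000 m,
    which is a simplification for illustration only."""
    threshold_m = 1000.0  # assume little to no response below this
    frac = max(altitude_m - threshold_m, 0.0) / (3454.0 - threshold_m)
    return gain_at_3454m_ml * frac

# A typical "live high" camp altitude of ~2,500 m:
camp_gain = expected_rbc_gain_ml(2500)  # roughly 60 ml

# The dose at which the 1986 elite runners first saw any benefit
# (200 ml did nothing; 400 ml produced a small VO2 max improvement):
elite_threshold_ml = 400.0

print(f"Expected gain at 2,500 m: {camp_gain:.0f} ml")
print(f"Shortfall vs. elite threshold: {elite_threshold_ml - camp_gain:.0f} ml")
```

Even if the linear assumption is generous, the expected gain from a realistic camp lands several hundred milliliters short of the dose that moved the needle for elite runners, which is the core of Siebenmann's arithmetic.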
Training Camps Work at Any Altitude
To counter Siebenmann’s mathematical argument, coaches and physiologists have their own experiences: they’ve watched countless athletes head to altitude camps, boost their red blood cell count, and improve their performance. Numerous studies have observed the same thing. How can this contradiction be explained?
One option is the training camp effect: you send athletes away to an idyllic mountain resort, away from the stresses of daily life, and tell them that this is their opportunity to make a big gain in fitness. As a result, they train like animals, sleep like hibernating bears, and emerge ready to conquer the world.
And there’s a further wrinkle: it’s very difficult to run blinded studies of altitude training, so athletes are well aware of whether they’ve been assigned to the “good” group that’s expected to get better, or the control group whose expected role is to stagnate. This has easily predictable effects on how hard the athletes train.
For example, if you look back at the classic 1997 study that first established "live high, train low" as the altitude training approach of choice, you can see this in action. The study reported "Trimps," a measure of training load that combines duration and intensity, for three groups: live low/train low; live high/train low; and live high/train high.
During the altitude training weeks, the high-low group has a training load that’s roughly 50 percent higher than the low-low control group. This difference wasn’t statistically significant, but after eyeballing the data it doesn’t seem all that surprising that the high-low group (grey circles) ended up producing the best race results. Similar patterns show up in other altitude studies, with the altitude group simply training harder than the control group.
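For readers unfamiliar with Trimps, here's a short sketch of how a Trimp-style score is computed. This uses Banister's original "training impulse" formulation, with heart rate standing in for intensity; the 1997 study may well have used a variant, so treat the coefficients as illustrative.

```python
import math

def trimp(duration_min, hr_ex, hr_rest, hr_max, male=True):
    """Banister's TRIMP ("training impulse"): session duration weighted
    by an exponential function of heart-rate intensity. Coefficients
    are Banister's published values for men and women."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)  # fractional HR reserve
    k, b = (0.64, 1.92) if male else (0.86, 1.67)
    return duration_min * dhr * k * math.exp(b * dhr)

# The same 60-minute run scored at two intensities. Because intensity
# is weighted exponentially, the harder session earns roughly double
# the Trimps, which is why a group that trains harder shows a much
# larger load on this kind of chart.
easy = trimp(60, hr_ex=140, hr_rest=50, hr_max=190)
hard = trimp(60, hr_ex=170, hr_rest=50, hr_max=190)
```

The exponential weighting matters here: a 50 percent gap in Trimps between groups can come from modestly harder sessions, not just more hours.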
There have been a couple of attempts at double-blinded altitude experiments, in which athletes live in altitude chambers for weeks at a time where the oxygen settings of each room are kept secret. One of those studies was led by Siebenmann back in 2012; neither found any improvement in performance compared to the control group.
Not Everyone Responds
Even if you dismiss all these concerns, it's still well-known that some people respond well to altitude training while others don't. In the 1997 study, the researchers divided their 39 subjects into 17 responders, who improved their 5,000-meter times by an average of 36.6 seconds; 7 neutral responders, who saw no significant change; and 15 non-responders, who actually got slower by an average of 24.0 seconds. As Siebenmann points out, that's not a non-response, it's a negative response. And if you're an elite athlete, 17 out of 39 doesn't seem like great odds.
There’s been lots of work since then trying to identify what causes non-response. Some of the potential culprits, like iron deficiency or calorie shortage, may be possible to rectify. Even if they aren’t, you’d like to think you could predict who the responders and non-responders are to make sure you don’t send athletes to a training camp that ends up making them worse.
But a 2010 study from the Australian Institute of Sport’s altitude training group isn’t very encouraging. They put eight runners through a sequence of two 3-week altitude training blocks using simulated altitude, to see if those who responded the first time were the same as those who responded the second time. The results: two runners got faster after both blocks; two runners got slower after both blocks; and the other four runners got faster after one block and slower after the other block. So even if you “respond” once, it’s pretty much a coin toss whether you’ll respond the next time.
So that’s one side of the argument. What about the opposing view, which in the MSSE debate was presented by French researchers Grégoire Millet and Franck Brocherie? It’s also convincing, building on a large pile of studies that find performance benefits from various types of altitude training protocol. I won’t go through it in detail, since that’s what most people already believe anyway.
The problem with the debate is that the two sides are basically arguing past each other. Siebenmann and Dempsey believe that the vast majority of altitude training studies are flawed, because they're unblinded, subject to placebo effects, and sometimes marred by big differences in training load between groups. Millet and Brocherie believe that, even if the data is imperfect, it overwhelmingly points to a benefit from altitude training.
Personally, if we were arguing about a brand new supplement, my reading of the evidence would probably follow Siebenmann’s: I’d have a hard time justifying big expenditures of time and money on the basis of the existing evidence. But I have a hard time disregarding the nearly unanimous verdict of elite endurance athletes around the world. We all have the capacity to fool ourselves now and then, and some of us enjoy being fooled more than others—but it’s hard to sustain a massive delusion about the quantifiable benefits of a training technique across many countries and many decades.
Without presuming to pronounce a final verdict, I guess my thinking for the moment is something like this: altitude training works. The theory is sound, and lots of people swear by it. But actually getting the numbers to line up for an individual athlete is far trickier than most people realize, and the illusion of success is probably helped by a bunch of other benefits that training camps provide. So if you get a chance to spend a month training in Flagstaff or St. Moritz, I’d grab it in a heartbeat. But if you get a similar opportunity in some idyllic training mecca closer to sea level, I’d grab that too, because what’s in the air may not matter as much as we thought.
For more Sweat Science, join me on Twitter and Facebook, sign up for the email newsletter, and check out my book Endure: Mind, Body, and the Curiously Elastic Limits of Human Performance.