
Filled with Science, but Unscientific

I hate to say it, but many of my fellow science-interested lifters who throw up #evidencebased in their posts are the perfect ammunition for the eye-rolling bros who view the whole concept of a scientific approach to lifting with disdain. That disdain often doesn’t actually come from science itself, but rather from the people who throw science in their faces online. You know the stereotype: the overthinking, spreadsheet-using, macro-tracking PubMed ninja who always manages to point out logical fallacies, but never manages to make gains.

Here’s the thing: I wish it was always just a stereotype. But speaking from experience, it’s sometimes closer to the truth than I’d like to admit. A lot of us (and I say “us” because I have resembled elements of this stereotype at times) operate from a place of anxiety. The technical term is FOMOOG (fear of missing out on gains). Many who are drawn to science are drawn to its perceived definitive nature, because they have a poor relationship with uncertainty. Anecdotes aren’t always trustworthy, so scientific publications become the safety blanket. Less-than-stellar progress leads to micromanaging: tracking more variables, reading more studies, and incorporating more and more science because of FOMOOG.

One day you look up and you’ve recently started an autoregulated daily undulating program guided by both RIR and your new velocity tracker; you’re doing PAP top sets before your back-off sets on main lifts, BFR on your single-joint accessories, and myo-reps on your compound accessories; your deloads are autoregulated by a combined scale averaging your session RPEs, HRV scores from your new Fitbit, and 1-10 soreness ratings; and you’ve got a fast-decay exponential taper planned for the end of the block when you want to test AMRAPs and 1RMs. When you send your program to your friends you have to include a page on your spreadsheet with definitions for all the acronyms. You’ve also just recalculated your macros and started taking creatine, fish oil, beta-alanine, citrulline malate, caffeine, vitamin D3, and a multivitamin, and you’re trying to eat more high-nitrate fruits and vegetables. Sheesh, you’ve barely had time to make fun of people for training with body part splits this week with all of the planning, reading, purchasing, and implementation! But it was well worth it: now you can cite studies for each of these decisions. You’ve truly earned that #evidencebased hashtag.

Obviously, this is a bit facetious on my part, and a caricature not representative of any large segment of our community. But just barely. For some of us, if we’re honest, it’s also uncomfortably close to the truth of where we are, or at least where we have been. So, what’s the problem with the above? Hell, some of you might be thinking, “Wait, I do a lot of that. Are those things no longer evidence-based? Should I not be doing them?” Before you rush to open another window to see if a new meta-analysis came out disproving one or more of these strategies, just chill and keep reading. The issue is not with the strategies, but with how they are implemented.

Ask yourself this: why aren’t there any studies where all, or even more than a few, of these strategies are studied simultaneously? The savvy reader probably realizes that for a study to be valid, it has to control for confounding variables, so scientists typically make only one variable different between groups in order to isolate its effect. That’s the scientific method in a nutshell: change only one thing, observe what happens. That’s also what some of the bros who think they have a disdain for science do. The bros frustrated by the caricature above, who make gains or coach successfully by focusing on “what works,” are using an inherently scientific approach without realizing it. Their plans might be simple, and might not be filled with science, but they are absolutely using the scientific method. Complexity is not the same as being scientific, and more often than not, complexity serves to obfuscate rather than optimize.

As a coach, I learned from being a scientist. Scientists design studies to answer a question and isolate the outcome. When a good coach troubleshoots, they do the same. A bad coach, on the other hand, throws the kitchen sink of every possibly beneficial strategy at a problem simultaneously. Even if doing so works, they don’t know why. It’s not reproducible, and they didn’t become better at solving a specific problem. They didn’t update the pattern-recognition algorithm we call “experience” by finding a specific solution to a problem that might be of use in the future. The bad coach in this example is really not a coach at all, but a collection of knowledge. They don’t have skill, which is developed through experience, and experience is exactly what their approach fails to generate.

There is another layer to this discussion, beyond just “change one thing at a time.” Earlier I said that many who are drawn to science are drawn to its perceived definitive nature. If you can identify that in yourself, it’s important to always hold close what science is at its core. Truly, science is about embracing uncertainty. The statistics used to infer study outcomes are built on probability, not certainty. Science is constantly self-correcting and evolving by design, and it can only do so by being appropriately uncertain about its findings. If we applied inappropriately high levels of certainty to findings that are actually still uncertain, there would be no impetus to refine the field. Science is a process of becoming continually less wrong, not a process of finding truths.

This is an easy notion to nod your head at, but a difficult thing to truly internalize. Have you ever struggled to let go of one of the scientific safety blankets in your training or nutrition that you held dear, once data came out showing it didn’t work, or didn’t work as well as you once thought? I have. I thought high-protein diets would have a large effect on lean mass retention while dieting, that RPE would have a large effect on strength gains when used to autoregulate load, and that refeeds and diet breaks would have a large effect on body composition and on mitigating the effects of low energy availability. Right now, the data leans towards a small effect at best in each case. But with that said, these are emerging fields. It’s also possible that more data will show those effect sizes are actually medium, neither small nor large.

That’s not a comfortable mindset for someone who craves certainty, and it takes continued vigilance even for scientists to retain it. It’s even harder for those who put their ego on the line fighting about science on social media, who promote a certain method, or, worse, who sell a certain method that science used to support.

As someone who’s lost a safety blanket or two, I can tell you my first reaction isn’t a zen-like “the data are what the data are…ohmmmm.” My first reaction, every time, is a little bit of denial, anger, and frustration. It doesn’t last long, and I don’t often act on it, but the desire to throw my toys is absolutely there. But if I were to explode in a fit of resentment and lash out at science when my safety blankets turned out to be less effective than I previously thought, that would be the same thing as someone saying, “Science? Ha! They don’t get anything right. Last week they said x, this week they say y! I’m sticking with common sense!”

If you really want to be scientific, not only do you have to use the scientific method in your training, rather than cramming as many scientific methods into your training as possible, but you also have to understand the very nature of science as a process of embracing uncertainty, rather than a safety blanket against it.
