A Brief Guide to "Guru Syndrome" and Separating Signal from Noise
On being a better consumer of information in an age of nonsense
The internet is many things, but perhaps foremost it is a noisy place with countless people vying for attention. Aside from easier-to-spot grifters imploring you to eat raw liver or dunk your face in a bucket of ice water at 4 AM, popular science and self-help content is everywhere.
Some of it is smart and informative. Much of it is not.
You don’t want to become cynical, but you’ve got to be at least a little skeptical.
One of the most frequent questions I’m asked is some version of: “There is so much information out there, and much of it is conflicting; how do I know whom to trust?”
Here’s a simple guide to help. If you spot any of the following—or worse yet, all of it—proceed with caution.
1. Unnecessary complexification:
Popular science and self-help should not sound like academic writing. If someone is using broad, complex-sounding, and ambiguous terms—e.g., toxins, detox, mitochondrial health, mTOR pathways—it could be a sign they don’t have much substance to offer.
When people lack genuine knowledge, they often hide behind endless complexity in an attempt to sound smart. Beware of people who develop their own sciency-sounding vocabulary for whatever it is they are selling. Real expertise tends to be simple and accessible.
2. Proprietary, on sale, and expensive:
If someone is pushing a product, program, or “proprietary” method, skepticism is warranted. Science is supposed to be dispassionate. There are usually many roads to Rome. There is almost never a single, let alone proprietary, answer to any significant health or performance problem. And in the rare cases there is, it’s because a medication has been through clinical trials and remains on patent. Otherwise, there are no secrets.
Every influencer or podcaster wants to reach an audience. Every writer wants their ideas to spread, myself included. But there’s a difference between wanting to spread one’s ideas and wanting to build a copyrighted empire to the exclusion of all else.
3. Framing everything as “us” vs “them”:
Conflict generates attention and riles up emotion. If someone is using contention and tribalism to sell an idea, then the idea probably isn’t strong enough to stand on its own.
You see this all the time in the world of diet, fitness, and behavioral change. If the method is strong and probably true, you shouldn’t need to fabricate battles and boogeymen to sell it.
4. Precise and narrow interventions with massive impacts:
Be cautious of advice that promises big results across many domains. Few things have such broad impacts. (Exercise is one of them. Most aren’t.)
For example: The flu shot is effective against the flu. The flu shot is not effective against every single illness. Claiming that some new, specific concept or approach applies precisely to a multitude of problems and makes a big difference in each is usually overreach.
5. Over-reliance on emotion and story:
Stories are powerful. But they should illustrate the science, not replace it. If a narrative drives the point more than data and history, it’s worth pausing and questioning. Data is not the plural of anecdote.
Consider my field of non-fiction writing. The best books—the ones that endure over time and contain truths with a capital “T”—start with strong theory and data and then use stories to bring the reader along. The books that crush it for a short period but turn out to be fraudulent, phony, or hype start with alluring stories and then fabricate or cherry-pick data to support the narrative.
6. An inability to cite one’s sources:
If someone is making research-backed claims, they should clearly show their sources and be open to scrutiny, especially if their credibility depends on being scientific or evidence-based. If someone cannot entertain how they might be wrong, then they are drifting into religious belief, which can be fine, but is very different from critical thinking.
7. Contrarian on everything:
Sometimes the prevailing wisdom is wrong. Most breakthroughs are at first deemed crazy. However, if someone is a serial contrarian—if they assume history, science, and practice are wrong about everything—that generally means the person is no longer thinking straight but is instead addicted to the thrill of going against the grain.
8. Guru syndrome:
If someone lacks humility, never addresses how they might be wrong, and reaches far beyond their expertise in providing answers for everything, that is a massive red flag.
We live in an age where online gurus regularly weaponize their audience against anyone who questions their ideas. No one person is an infallible source of all answers.
Now that you know what to look out for, here are some questions you can ask yourself when consuming advice on the internet:
Is the idea backed by a strong theory that makes sense in simple terms?
Is the idea supported by empirical evidence, and ideally not just a few small rodent studies but many significant ones?
Does the idea appear in different contexts throughout history, and does it also appear in modern practice? Are there solid patterns that emerge?
When all three show up, that’s a good sign that you’re on firm ground.