Decode the Science Without Reading the Fine Print
Academic articles make my eyes glaze over. I can usually push through a few sentences, maybe the full abstract if I have a bit of caffeine in me. Reading them aloud can help, but I sometimes still go into “robot mode,” saying the words on the page while my brain is making up theories about whatever book I’d rather be reading.
Let’s be honest: academic articles aren’t exactly something I’d curl up with in my comfy chair while snuggling my cat. But working as a technical writer at a company that champions open science, I’ve had to dive back into the scholarly deep end—sometimes even reading papers that evaluate other papers (yes, really).
That first plunge involved a lot of caffeine and more frustrated sighs than I’d like to admit, but I came out the other side with a blog post and a surprising new appreciation for academic rigor—and a much better sense of how to navigate research in my daily life.

Since then, I’ve written a few more blog posts, and while I’ve learned how empowering it is to be able to evaluate research for myself, it’s not something I tend to do in my personal time.

But that raises the question: if I’m not willing to dive into the weeds (and caffeine) every time I see a new study referenced on my recommended page, how do I know I’m not being duped? As someone who has, more than once, checked for “gullible” written on the ceiling, I’ve learned from experience that being tricked is not a great feeling, nor is it a healthy way for society to function on a broader scale. It’s been weighing on my mind so much that I made a video on the subject of “magic studies” for the DUTCTV YouTube channel.
In the video, I conclude by saying that “the best way to fight the magic study [is to] make science available for everyone to see, test, and improve.” A lovely sentiment, right? In an ideal world, every study or nugget of scientific data would be cited in every video or post that references it, and everyone who sees it would have the time, ability, and bandwidth to read the study in full before forming an opinion.
That’s about as likely as my cat writing her own academic paper. On top of being unlikely, the expectation is also inaccessible for many people. From the standpoint of internet access alone, my phone handles Instagram way better than it handles a 200-page PDF from a university server. Even if the link to every referenced research paper were readily available, it may not be feasible to open and read it on every device.
So, what’s a more reasonable, accessible solution?
Let’s revisit a concept I’ve mentioned previously: information literacy. According to the American Library Association, information literacy is "the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning."
Let’s break that down. Information literacy means finding out…
- Where did the information come from?
- Who made it? Why?
- What do the creators want to do with it?
In the wild world of the internet, consumers of content face a complication: not only do they need information literacy about the research being referenced, but they also need to understand the background and motivations of the person who posted or shared it. Sometimes those goals align, but it’s also not uncommon for research to be twisted to fit narratives online, intentionally or otherwise. (There’s a full study on this phenomenon here, or read a breakdown of the facts in The Harvard Gazette.)
We have a lot of work to do. Let me break this into a how-to guide:
How to not get caught looking for “gullible” on the ceiling when you’re just trying to have fun online
Step One: Your Red Flag Checklist
A lot of posts online dangle certain phrases to bait you into clicking or engaging. On most websites, just clicking or tapping a post counts as engagement, which will boost the post’s popularity and spread it to more and more people.
Here are some common red flags to look out for to save you time:
- “Scientists are stunned” - If the scientists were stunned, they would say so themselves.
- ALL CAPS OR EXCESSIVE EMOJIS 🔥🚩🔥 - The facts should speak for themselves. There’s no need to shout them or decorate the information; it’s likely that the creator is using this to draw your attention without any real substance to offer.
- “This food is slowly killing you,” or “Doctors are hiding this cure,” or “What the government doesn’t want you to know” - Fear- or shock-based framing is meant to jolt your system into feeling like you need to learn more for your own survival. Paranoia and anger perform well in any algorithm.

And here are some yellow flags that signal for you to proceed with caution:
- “Recent study shows” - When? What kind of study? Without an immediate follow-up of sources, there’s no way to know if this study even exists, or, if it does, how good it was.
- Snippet of data or a quote with no context - While this may be harmless, without the full picture, it’s easy to warp the meaning of information to fit a narrative.
- Buzzwords that sound science-adjacent - Think things like “detox” or “natural remedy.” Your body can detox, and there are plenty of helpful natural remedies. But these terms can also be used to sell unregulated products or practices that may or may not work, and without real research to back them up, it’s often best to steer clear.
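
If you like to tinker, here’s a minimal sketch of how you might turn this checklist into a quick first-pass filter. The phrase lists and the `flag_caption` helper are purely my own invention for illustration, and real posts are messier than simple substring matching, so treat it as a toy, not a verdict.

```python
# Toy sketch only: the phrase lists and this helper are illustrative,
# not a real moderation or fact-checking tool.
RED_FLAGS = [
    "scientists are stunned",
    "slowly killing you",
    "doctors are hiding",
    "doesn't want you to know",
]
YELLOW_FLAGS = [
    "recent study shows",
    "detox",
    "natural remedy",
]

def flag_caption(caption: str) -> dict:
    """Return the red and yellow flag phrases that appear in a caption."""
    text = caption.lower()
    return {
        "red": [phrase for phrase in RED_FLAGS if phrase in text],
        "yellow": [phrase for phrase in YELLOW_FLAGS if phrase in text],
    }

print(flag_caption("Recent study shows the DETOX trick doctors are hiding!"))
# {'red': ['doctors are hiding'], 'yellow': ['recent study shows', 'detox']}
```

A post that trips a flag isn’t automatically wrong, just as a post that passes isn’t automatically trustworthy; the checklist only tells you where to slow down.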
Step Two: Hunt Down the Source
You are a detective, and you always carry your little magnifying glass with you. Luckily for you, most browsers come equipped with a built-in search feature, and conveniently, its button even looks like a little magnifying glass.

But, seriously, Google is a great first step. Between your basic searching options and new AI tools that will try their best to help, your first job is to find the study mentioned.
Ideally, the creator will cite their sources, but from my experience using social media, that’s pretty rare. Even when people ask directly for a source, the creator might not provide one, or they might not have one at all. So, pull out your magnifying glass and get to work.
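
One trick that speeds up the hunt is searching for the exact phrase a post used and limiting results to domains you already trust. The `source_hunt_url` helper below is hypothetical (the name and defaults are mine), but the quoted-phrase and `site:` tricks are standard search operators.

```python
# Hypothetical helper: build a search URL you can paste into any browser.
# Quoting the claim searches for that exact phrase; "site:" limits results
# to a single domain, such as nih.gov or a university site.
from urllib.parse import quote_plus

def source_hunt_url(claim: str, site: str = "") -> str:
    query = f'"{claim}"'
    if site:
        query += f" site:{site}"
    return "https://www.google.com/search?q=" + quote_plus(query)

print(source_hunt_url("chocolate improves memory", "nih.gov"))
# https://www.google.com/search?q=%22chocolate+improves+memory%22+site%3Anih.gov
```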
Step Three: The How
The word “study” is a very broad term—a study can be anything from a ten-year research project to a survey sent to ten people in one Facebook group. Even if you don’t read the full article, you should know the general size and shape of the study you’re evaluating.
Was it comprehensive or a small sample? Was it biased or double-blind? Was it dated or recent?
Before you even look at the results, just knowing the environment in which the information was gathered can be enough to dismiss the research or to decide it’s worth digging into further. You can learn more about how to evaluate a study in Dr. Kelsi West’s video, “What does this research paper mean???”
Step Four: The Who and Why
Here’s where you split your investigation into two threads: who is behind the study, and who is behind the presentation?

When looking at the creator who’s sharing the information, the first thing to do is follow the money. Are they selling something related to this study? Are they getting paid to talk about it? Are there any sponsorships? Does the post advertise for a platform where they talk about it in more detail?
And, most importantly, look into their agenda: does the creator’s agenda line up with or directly oppose the study? Are they trying to change your mind on a particular topic or ideology?
While everyone certainly has biases and is free to share their opinions, it’s important to recognize when biases exist so we can sort through them to find the truth. In the end, you get to form your own opinion about the information presented, but you should have all the facts in front of you before you make up your mind.
Step Five: The What
Look at that! We’ve made it to the last step without even needing to read the article.
So, how do you figure out what it’s actually saying without having to read every word?
Start by looking for a summary from the original source. Many scientific papers include an abstract that lays out the goals, methods, and conclusions in a few paragraphs. If that’s too technical, try these tricks:
- Check trusted science communicators (e.g., NASA, NIH, major universities, or known science outlets) to see if they’ve already broken it down.
- Search for press releases or blog posts from the research team. These often explain the findings in plain language, sometimes even with diagrams.
- Use tools like Semantic Scholar or Explainpaper to get simplified summaries (see the sketch after this list for a quick way to query Semantic Scholar).
- Look for coverage in multiple mainstream outlets. Make sure you compare a few, and watch out for dramatized takes.
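
For the Semantic Scholar option above, here’s a small sketch using its public Graph API search endpoint as I understand it from the public docs; the `find_paper` name, field list, and example query are my own choices, so double-check the API before leaning on it.

```python
# Sketch of searching Semantic Scholar's public Graph API for a paper.
# Endpoint and parameters follow the public docs as I understand them;
# the helper name and example query are illustrative only.
import requests

def find_paper(keywords: str, limit: int = 3) -> list:
    """Return a few candidate papers matching the keywords."""
    response = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={
            "query": keywords,
            "limit": limit,
            "fields": "title,year,abstract,url",
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("data", [])

for paper in find_paper("misinformation sharing on social media"):
    print(paper.get("year"), paper.get("title"))
```

Even a couple of titles and publication years is enough to confirm that the study exists and roughly when and where it was done.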
Then, ask yourself:
- Does this match what the original post or video claimed?
- Is the conclusion being exaggerated, oversimplified, or taken out of context?
- Do you feel like you understand the key takeaway well enough to explain it?
Wrap-Up
Reading research doesn’t have to mean pushing through tough jargon or trying to get your eyes to focus. You don’t have to be a scientist to ask good questions, spot red flags, or track down what’s really going on beneath the pretty post. Information literacy isn’t about being perfect: it’s about being intentional.

For the sake of transparency, here's my motivation for this post: Don't Use This Code offers a free Open Science Skills Training to help anyone—researchers, writers, lifelong learners—acquire tools to ask better questions and trust science for the right reasons. It’s accessible, beginner-friendly, and full of things I wish I’d known when I first started digging into research.
Now I’m going to pull you onto another platform where we can talk more in the future:
What red flags do you watch for with research? Have a favorite myth-busting moment or a source you always turn to? Come tell me on the DUTC Discord—we’re always swapping tools, tips, and hot takes about science in the wild.
Until next time, happy reading!