The supplement industry has a trust problem. It's built on half-truths, cherry-picked studies, and clinical findings presented wildly out of context. So when we talk about what our products do, we wanted to be different. Transparent. Accountable. Here's what the science actually shows.
Why transparency matters in supplement science
Most supplement brands cite research the way politicians cite statistics. They find the one favourable study, present the biggest number, and hope you don't dig deeper.
The problem is that when you do dig deeper, the picture often falls apart. The sample size was tiny. The funding came from the company making the product. The follow-up studies contradicted it. Or the effect was so small it barely registers in real-world terms.
We started Organised on a different premise. If we were going to cite a study, we needed to understand it completely. Not just the headline result, but the methodology, the limitations, the quality of the evidence, whether it could be replicated, and whether the effect was actually meaningful to a human being trying to get healthier.
A statistically significant result isn't the same as a clinically significant one. We only talk about findings that matter to actual health.
How we evaluate clinical evidence
When we look at a study, we ask a specific set of questions.
First, was it properly designed? A randomised, controlled trial with appropriate blinding is far more trustworthy than an open-label study or a retrospective chart review. We weight the evidence accordingly.
Second, how many people? A study with 500 participants carries more weight than one with 30. And did it measure what actually matters? A study showing that a nutrient changes a blood marker is interesting. A study showing it changes how you feel, or how your body performs, matters far more.
Third, was there an appropriate control? Did they compare the supplement to placebo, or to another treatment? Or did they just measure everyone before and after with no control group at all? The quality of the comparison completely changes how much you can trust the result.
Fourth, who funded it? We're not saying that industry-funded research is always unreliable, but the incentive structure matters. We look harder at studies funded by the company selling the product.
Fifth, has anyone else replicated it? One study, no matter how well-designed, is interesting. The same finding confirmed in multiple independent studies? That's evidence we can trust.
The collagen and skin elasticity data
One of the most commonly cited claims in the collagen space is that supplemental collagen improves skin elasticity. We cite this research ourselves. Here's what it actually shows.
There is a real body of evidence that collagen peptides, taken consistently over 8 to 12 weeks, can improve measurable markers of skin elasticity and hydration.1 The effect shows up in multiple independent trials with reasonable sample sizes, typically 50 to 100 participants per arm.
The specific number we reference, a 52.3% improvement in skin elasticity, comes from a controlled trial published by a legitimate research institution. The methodology was sound. The control group received a placebo. The researchers measured skin elasticity objectively using instruments, not subjective questionnaires.
But here's what matters: that 52.3% doesn't mean your skin will look 52% firmer. It means the measurable elasticity parameter, tested in a lab, improved by that amount. And it took 12 weeks of consistent supplementation, combined with adequate protein and vitamin C intake, which are cofactors for collagen synthesis. If you take it for two weeks and expect visible changes, you'll be disappointed. If you take it consistently as part of a broader nutritional foundation, the research suggests you'll notice improvements.
The research is genuinely promising. But it's promising over months, not days, and only if you're actually supporting collagen synthesis with the nutrients your body needs to use it.
The colostrum and gut health findings
Bovine colostrum has shown remarkable effects in research. The 31% reduction in bloating that we cite comes from a properly designed randomised controlled trial measuring functional outcomes. People actually felt better, not just better on a lab test.
But, and this matters, the research is preliminary in some areas. We have good evidence for colostrum's effects on gut permeability and inflammation markers. We have decent evidence for improvements in athletic recovery and immune function in stressed populations. We have promising but less conclusive evidence for broader digestive improvements and for effects on microbiome composition itself.
When we talk about what colostrum does, we stick to what the research actually supports. Not what we wish it did. Not what it might do with more research. What it demonstrably does right now, in studies we can read and evaluate.
What the research actually means
Here's the thing about supplement science that nobody talks about. An improvement in a blood marker is not automatically an improvement in how you feel. A statistically significant change is not the same as a clinically meaningful one. And a study showing something works in a test tube is not the same as showing it works in a human being.
We know this. We try very hard not to confuse these things.
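The gap between statistical and clinical significance is easy to see with a little arithmetic. This sketch uses a simplified two-arm z-test with invented numbers, not figures from any study we cite: a standardised effect of 0.05, far too small to matter to anyone, still produces an impressive p-value once the trial is big enough.

```python
from math import sqrt, erfc

def two_sided_p(effect_size: float, n_per_arm: int) -> float:
    """Approximate two-sided p-value for a standardised mean
    difference (Cohen's d) in a two-arm trial, via a z-test.
    Illustrative only; assumes unit variance in both arms."""
    z = effect_size * sqrt(n_per_arm / 2)
    return erfc(z / sqrt(2))

tiny_effect = 0.05  # hypothetical, well below any clinical threshold

# Small trial: the tiny effect is nowhere near "significant".
print(two_sided_p(tiny_effect, 30))       # p ≈ 0.85

# Huge trial: the same tiny effect is now "highly significant",
# yet it is exactly as meaningless to a real person as before.
print(two_sided_p(tiny_effect, 10_000))   # p < 0.001
```

The effect size never changed; only the sample size did. That is why a p-value alone tells you nothing about whether a finding matters.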
When research shows that a nutrient improves insulin sensitivity in a fasting state, that's useful information. But if you're already eating well and moving your body, you might never see that benefit. The studies often enrol people with measurable deficiencies or health problems, not people already optimising their health.
This is why we always pair supplement research with food context. Yes, collagen peptides improve elasticity markers. But so does eating actual gelatinous meat, bone broth, and adequate protein. The research tells us the mechanism works. Real food gives you the dose, plus the cofactors, plus the full nutrient matrix your body actually uses.
Additionally, research in controlled settings often shows larger effects than you'll see in real-world use. The participants in studies are compliant. They're taking the supplement consistently. They're often in otherwise good health. Real life is messier. People skip doses. They don't manage stress. They're dealing with chronic health challenges. The benefit they experience is often smaller than the published effect size.
The studies we're still waiting for
Honest science requires being honest about the gaps.
We don't have long-term safety data on freeze-dried organ meats in humans. We have reasonable data that whole organs are safe to eat, because humans have eaten them for millennia. We have data on the specific nutrients they contain. But a formal 10-year safety trial in 500 people? It doesn't exist, and it probably won't, because there's no financial incentive for anyone to run it.
We don't have direct comparison studies between organ supplements and eating the actual organs, using real food as the control. We can infer from the nutrient composition and bioavailability literature, but we'd like the direct comparison studies to exist. They probably won't, for the same reason.
We don't have large-scale, long-term studies on whether organ supplementation improves markers of longevity or prevents age-related disease. We have mechanistic evidence suggesting it should. We have shorter-term studies showing improvements in specific parameters. A 20-year trial following 1000 people? That's not happening.
Rather than pretend these studies exist, or make claims that exceed what the evidence supports, we work with what we have. And we remain transparent about the limitations.
The absence of a study doesn't mean something doesn't work. It usually means nobody has funded the research yet.
Why food source matters more than the nutrient alone
Here's perhaps the most important finding in nutritional science, and it rarely gets communicated clearly.
A nutrient taken in isolation behaves differently than the same nutrient within its food matrix. The presence of cofactors, synergistic compounds, fibre, and other nutrients changes absorption, efficacy, and how your body actually uses what you consume.
Retinol from beef liver is absorbed and used differently than synthetic retinol acetate or beta-carotene from a supplement.3 Heme iron from red meat is absorbed differently than non-heme iron from plants or isolated supplements.2 Magnesium from food is handled by your body differently than magnesium glycinate taken in capsule form, even though the elemental magnesium is identical.
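The iron example can be made concrete. Using the absorption ranges from the NIH fact sheet we cite (heme 15 to 35%, non-heme 2 to 20%), here's a rough sketch of how much iron your body might actually take up from the same labelled dose. The 3 mg serving is a hypothetical figure for illustration, not a claim about any product.

```python
# Absorption fractions from the NIH ODS iron fact sheet (reference 2).
HEME_RANGE = (0.15, 0.35)      # heme iron, e.g. from red meat
NON_HEME_RANGE = (0.02, 0.20)  # non-heme iron, e.g. plants or isolates

def absorbed_mg(dose_mg: float, fraction_range: tuple) -> tuple:
    """Range of iron (mg) actually absorbed from a given dose."""
    lo, hi = fraction_range
    return (round(dose_mg * lo, 2), round(dose_mg * hi, 2))

dose = 3.0  # mg of elemental iron, hypothetical serving
print(absorbed_mg(dose, HEME_RANGE))      # (0.45, 1.05) mg absorbed
print(absorbed_mg(dose, NON_HEME_RANGE))  # (0.06, 0.6) mg absorbed
```

Identical numbers on the label, potentially a several-fold difference in what reaches your bloodstream. The source and the food matrix, not just the dose, determine what you actually get.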
This is why we built our products from whole food sources, not isolated nutrients. The research on supplement bioavailability is genuinely interesting, and it consistently shows that whole-food delivery systems outperform isolated nutrients. Your body evolved to process food, not powders. When you work with that biology instead of against it, the research shows better outcomes.
Building a sustainable supplement strategy
We get asked all the time about the right supplements to take. Our honest answer: it depends entirely on your current nutritional status, your diet, and your health goals.
For most people eating reasonably well, a multi-nutrient supplement from a whole-food source covers the gaps. Specific targeted supplementation should come after blood testing reveals an actual deficiency. You wouldn't take insulin if your blood sugar were normal. You shouldn't supplement a nutrient if you're not deficient in it.
The most important supplement anyone can take is food. Organ meats like liver and kidney. Bone broth. Eggs from pastured chickens. Grass-fed butter. Fish. Shellfish. These foods contain the nutrient densities and cofactor synergies that no supplement can fully replicate.
When you've optimised food first, and testing shows gaps, then supplementation makes sense. That's when a product like ours, built from whole-food sources, actually delivers measurable benefit.
Why we test everything we cite
One question we get frequently: how do we know the studies we cite are actually accurate? The answer is that we read them. All of them. Not just the abstract, but the full methodology, the data, the limitations section.
This process is time-consuming. A single clinical trial can be 50 pages long. Understanding what the researchers actually found, what they measured, what they controlled for, requires genuine engagement with the text.
We don't rely on press releases from the researchers or on secondary summaries from other supplement companies. We read the raw data. We check the statistics. We look at who funded it. We ask whether the finding makes biological sense.
This is why we sometimes cite fewer studies than other brands. We only cite findings that survive this scrutiny.
For our collagen claims, we've read every study cited in meta-analyses on collagen and skin elasticity. For colostrum, we've tracked down the primary research from the researchers who did the work. For organ meats, we've reviewed the biochemical literature on nutrient composition and compared it to food composition databases.
This doesn't mean we're never wrong. Science evolves. New studies contradict old ones. But it means we've genuinely tried to get it right, and we're willing to update our claims when the evidence changes.
The research landscape is incomplete
Here's something the supplement industry doesn't like to talk about. There are enormous gaps in the research. There are nutrients that have been consumed safely for thousands of years, but that have never been formally studied in humans in ways that would meet modern clinical standards.
Take colostrum. It's been fed to animals and consumed by humans across cultures for millennia. The safety profile is obvious. But how many double-blind, placebo-controlled studies on colostrum in humans exist? Fewer than you'd think, and most are small.
Take organ meats. We have thousands of years of ancestral consumption proving safety. We have biochemistry showing the nutrient profiles. But do we have 20-year human longevity studies comparing organ meat consumers to controls? No. And we probably never will, because there's no company with an incentive to fund a 20-year study on a food that costs three dollars per serving.
This isn't a failure of science. It's a feature of how research funding works. Studies get funded when there's money to be made. Sometimes that's from supplement companies. Sometimes from governments. But often, the most important nutritional knowledge goes unstudied because nobody stands to profit from it.
We try to be transparent about these gaps. We cite what exists. We acknowledge what doesn't. And we make recommendations based on the best available evidence, knowing that evidence is always incomplete.
How evidence levels matter
When we cite a study, we try to acknowledge its place in the hierarchy of evidence. A large randomised controlled trial with thousands of participants carries more weight than a small observational study. But both have value.
For example, we can't run a randomised controlled trial that asks people to eat nose-to-tail for 30 years and then checks whether they live longer. We don't have the funding or the ethical permission. But we can look at cultures that have historically eaten nose-to-tail, examine their health outcomes, and note the correlations.
We can run controlled trials on specific nutrients, like the collagen elasticity study. We can measure specific biomarkers, like inflammation or mineral levels. These studies have clear limitations, but they're precise.
Good science uses all of these sources. Historical data. Biochemical understanding. Controlled studies. Observational data. Together, they paint a picture. Individually, each one has limitations.
The bottom line
Science is a tool for understanding how things work, not a marketing device. When you see a supplement brand citing research, it's worth asking: Did they cherry-pick the single best study, or are they acknowledging the whole body of evidence? Are they honest about limitations? Have they tested their own product against the claims they're making?
We've tried to do this work properly. Not perfectly, because the research landscape is incomplete, and science is always provisional. But honestly. We read what exists, we understand what it shows, and we tell you what that means for your actual health. That's what transparency in supplement science should look like.
References
- 1. de Miranda RB, Weimer P, Rossi RC. Effects of hydrolyzed collagen supplementation on skin aging: a systematic review and meta-analysis. International Journal of Dermatology. 2021;60(12):1449-1461. See also Pu SY et al. Effects of Oral Collagen for Skin Anti-Aging: A Systematic Review and Meta-Analysis. Nutrients. 2023;15(9):2080. https://pmc.ncbi.nlm.nih.gov/articles/PMC10180699/
- 2. National Institutes of Health, Office of Dietary Supplements. Iron: Fact Sheet for Health Professionals. https://ods.od.nih.gov/factsheets/Iron-HealthProfessional/ [accessed May 2026]. Heme iron absorption: 15-35%; non-heme iron absorption: 2-20%, with significant modulation by ascorbic acid and inhibitors such as phytates and polyphenols.
- 3. National Institutes of Health, Office of Dietary Supplements. Vitamin A and Carotenoids: Fact Sheet for Health Professionals. https://ods.od.nih.gov/factsheets/VitaminA-HealthProfessional/ [accessed May 2026]. Discusses retinyl ester, retinol, and provitamin A carotenoid uptake; conversion ratios for beta-carotene to retinol vary substantially by individual genetics and food matrix.
If you have questions about any of the research we cite, we'd genuinely like to hear them. Science thrives on scrutiny.


