CSA Statement on 2025 Collagen Meta-Analysis
When an eye-catching new study is published, it is fair to wonder whether a pre-set agenda or a choreographed pattern is at work. Sometimes the findings misinterpret important data or overstate a real effect, underscoring the importance of careful analysis and clear presentation of results, something increasingly forgotten in our clickbait-driven, sensationalized media landscape.
This past May, a study from Myung and Park became the latest spark attempting to light fires, with LinkedIn commentary claiming it was the final nail in the coffin for the collagen industry. Before we jump to conclusions, let's all take a deep breath and agree that sensational claims require solid data. Let's dive in and check a few facts.
While the overall analysis does find collagen supplementation beneficial for skin health parameters, in agreement with the broader body of literature, the attention-grabbing claim in the Myung & Park paper is its subgroup analysis. There, the authors suggest collagen supplementation shows positive results only in industry-funded studies, a serious statement and one that deserves scrutiny. Unfortunately for the authors, cracks emerge quickly: start at the very source of their argument, the funding classifications, and the data does not hold up.
For instance:
- Choi et al. (2014) was flagged as industry‑influenced, but in reality it received no direct funding. The study did use collagen peptides supplied by Amorepacific Co. (via its R&D center), and frankly, that is not a problem: when industry provides materials rather than cash, it can actually enhance reproducibility by ensuring future researchers have access to the same test substances.
- Sugihara et al. (2015), Inoue et al. (2016), Tak et al. (2021), and Seong et al. (2024) were all marked as independent, despite clear company affiliations or support.
When you are basing conclusions on funding bias, getting the funding wrong in multiple studies undercuts the core argument. It is not a minor technicality, especially when two-thirds of the five (5) studies cited as free of industry influence are in fact industry-supported.
There are also several data inaccuracies that raise questions about the overall reliability of the meta-analysis:
- Yoon et al. (2014) was reported in the meta-analysis as using 0.75 g of collagen per day; however, participants actually consumed four capsules of hydrolyzed collagen per day, for a total daily dose of 3 g.
- Lin et al. (2021) was reported in the meta-analysis as using 50 g daily; however, it was actually 50 ml of a liquid collagen drink containing about 5.5 g of collagen peptides.
- Guadanhim et al. was labeled as bovine collagen only, though the source was a bovine/porcine blend.
- Seong et al. (2024) was reported in the meta-analysis as using a dose of 2.5 g/day of collagen. However, checking the ingredient composition in the original article confirms that only 2 g of low-molecular-weight collagen peptide obtained from fish scales was used.
- Lee et al. (2023) was reported in the meta-analysis as using a dose of 1.65 g of fish collagen, based on the abstract and conclusion of the original article. However, the study schedule in the original article indicates that participants were instructed to consume four tablets, each containing 1.65 g of collagen peptides, once daily. This puts the total daily dose at 6.6 g, not 1.65 g as originally reported.
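The dose corrections above come down to simple per-unit arithmetic (units per day times grams per unit). A minimal sketch of that check, with the function name and structure illustrative rather than taken from any cited study:

```python
# Recomputing the corrected daily doses discussed above.
# Per-unit doses are taken from the original trial reports as described in this statement.

def total_daily_dose(units_per_day: float, grams_per_unit: float) -> float:
    """Total daily collagen dose in grams."""
    return units_per_day * grams_per_unit

# Yoon et al. (2014): four capsules/day at 0.75 g each, so 3 g/day, not 0.75 g.
yoon_dose = total_daily_dose(4, 0.75)   # → 3.0

# Lee et al. (2023): four tablets/day at 1.65 g each, so 6.6 g/day, not 1.65 g.
lee_dose = total_daily_dose(4, 1.65)    # → 6.6

print(yoon_dose, lee_dose)
```

The point is not the arithmetic itself but that the meta-analysis reported per-capsule or per-tablet amounts as if they were total daily doses.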
Additional sourcing and duration errors add to the picture:
- For Koizumi et al. (2018), the meta-analysis misidentified the collagen source as general fish collagen, which can be a blend of type I and type II collagen, when it was specifically from fish scales, a type I-only source.
- Bolke et al. (2019) was described as a 16-week intervention, but the actual supplementation period was 12 weeks, followed by a 4-week observation period.
The inconsistencies in funding, dosing, sourcing, and duration not only cloud the interpretation of the results; they compromise the study's conclusions. It is also important to recognize that industry-funded research is a major contributor to the scientific landscape, with estimates indicating that industry funding accounts for 50-70% of all dietary supplement research [1,2]. Additionally, in their subgroup analysis, the authors imply that industry-funded studies are inherently lower quality. The actual Jadad scores tell a different story. Of the industry-supported trials, only one was open-label and non-blinded [3], and a second fell one point under the high-quality threshold with a Jadad score of 2 (on a 0-5 scale that rewards proper randomization, blinding, and accounting for withdrawals). Every other industry-backed study scored 3-5, well within the range of rigorous clinical research. If anything, these scores suggest that the industry-funded work in this meta-analysis was at least as methodologically sound as its "independent" counterparts. The implication that all industry-funded research is biased and inaccurate is, therefore, quite disingenuous.
The presentation of this meta-analysis highlights why the Collagen Stewardship Alliance (CSA) strongly advocates for third-party certifications that verify ingredient quality and doses backed by controlled human trials. CSA fully supports the Collagen Verified program under the NutraStrong™ brand, implemented by SGS-Nutrasource. The program uses a comprehensive document review process to elevate ingredients and brands by verifying source, dose, and adequate human research.
Let's all agree, then, that one meta-analysis is not settled science. More research is certainly needed, but in the meantime, we'll keep supporting the collagen ecosystem with higher standards, less bias, and more transparency.
