
How to Detect BS in a Research Manuscript: The Example of Prevagen Part II – Looking at the Paper

Updated: Nov 23, 2023




In our last blog, we used Prevagen as an example of how to evaluate the efficacy of a supplement, and specifically which sources are generally trustworthy (e.g. independent organizations such as Consumer Reports) and which are not (e.g. the company’s own website, Amazon reviews, social media). In the next two blogs, we will go directly to the source, the sole published research study of Prevagen in humans, and take a step-by-step approach to see whether or not Prevagen’s claim, “clinically proven to improve memory,” should be believed. The questions below can be used much like the Medical Bullshit Detector—if you get one or more “no”s, there’s a fair chance something is amiss.

  1. Are there any peer-reviewed scientific publications testing this supplement in people? Yes. This is promising. Unlike many of the supplements on the market, Prevagen has actually been studied in people and has one publication.(1) The journal, Advances in Mind-Body Medicine, does peer review, meaning that when an author submits a paper the journal will send it out to two or more experts in the field to review it for validity and significance. Although peer review is an imperfect process, it does sometimes catch BS before it spreads. As noted in a prior blog, if it’s not proven in people, it’s not proven to work.

  2. Is the reputation of the journal consistent with the claims of the supplement? No. For a supplement claiming to be clinically proven to improve memory, one would expect the supporting study to be published in a well-known, high-impact journal. The Impact Factor, a measure of how often articles in a journal are cited by other scientists, is one way of gauging this, with higher numbers meaning more impact. Advances in Mind-Body Medicine, where this article was published, has an Impact Factor below 1 (as a rough guide, <1 is very low impact, 1-5 moderate, 5-10 high, and >10 exceptional).

  3. Was the clinical trial registered and reported with ClinicalTrials.gov? No. ClinicalTrials.gov was started in the early 2000s as a way to improve clinical trial reporting. By requiring researchers to register their trial before the first person is enrolled, ClinicalTrials.gov helps prevent researchers or companies from burying the results of negative trials or from changing their analysis plans after the data are in to make a study look more favorable.

  4. Was the study conducted by independent scientists without serious conflicts of interest? No. All of the authors worked for Quincy Bioscience, the company that makes Prevagen. This is a HUGE red flag. Conflicts of interest—meaning conflicts between the duty to do good science and any other motivator—do not get any bigger or more obvious than this.

  5. Have the results been replicated (ideally by an independent group of scientists)? No. One of the core principles of science and the scientific method is that the main outcomes of a researcher’s findings should be obtainable by a second researcher using the same methods. Failure to replicate findings could suggest that either bias or chance contributed to the first researcher’s findings. Failure to try to replicate can happen for many reasons, but when it comes to BS products, it generally indicates a desire to avoid uncovering any shortcomings in favorable results. It may also signal a fear that the results would not be replicated if done by independent scientists.

  6. Is there a strong rationale for the study? Partial, but mostly no. The authors do a good job of explaining why neuronal calcium (the purported target of Prevagen) may be related to memory issues in the aging brain, and they cite the work of several independent researchers. However, the only studies they cite supporting Prevagen’s ability to meaningfully affect neuronal calcium in animal or cellular studies were funded by Quincy Bioscience, and only one of the three was published in a peer-reviewed journal. Of note, the other two were abstracts presented at scientific meetings and never published; this is a common way for companies to give the appearance of scientific rigor without actually subjecting their work to scientific scrutiny.

BONUS QUESTION: Has the manuscript been retracted or seriously questioned?


The peer-review process is not perfect. Sometimes a shoddy or fraudulent paper will make it through. At that point, and particularly if the paper attracts attention, it is up to the wider scientific community to point out its flaws and question its validity, even to the point where the publishing journal unpublishes (retracts) the paper. This may be more common than you think. In fact, there is a website, retractionwatch.com, dedicated to monitoring retractions, with over 18,000 papers in its database. While the Prevagen paper has not been retracted, it has been seriously questioned, including by the FDA.


TAKE HOME POINTS:

  1. You can tell a lot about the quality of a research study by examining some surface-level features of the publication.

  2. If a publication makes big claims but appears in a low-quality journal, and especially if it is not peer-reviewed, be wary.

  3. If the authors of the publication have serious conflicts of interest (e.g. they work for the company that makes the product) and it has never been tested by independent scientists, be on guard for BS.

  4. If the premise of the study is not supported by sound science (e.g. appropriate peer-reviewed publications showing the steps preceding the trial in people), be prepared to be underwhelmed.

We’ll get into the actual science in the next blog.





References:

  1. Moran DL, Underwood MY, Gabourie TA, Lerner KC. Effects of a supplement containing apoaequorin on verbal learning in older adults in the community. Adv Mind Body Med. 2016;30(1):4-11.

*Image from: https://www.facebook.com/realfakescience/ (a group doing what sound like fun and engaging presentations on distinguishing real science from fake)
