Published on March 7th, 2019 | by Bonnie Patten
MyPillow Sleep Study is a Snoozer
A randomized, double-blind, placebo-controlled trial (RCT) is regarded as one of the most valued research methodologies around for examining the effectiveness of a product or intervention. And it’s also a great marketing tool. Companies love to tout their product as “clinically tested” or “clinically proven” in order to encourage consumers to part with their hard-earned dollars.
Unfortunately, TINA.org has seen a worrisome trend of companies playing fast and loose with what they claim to be clinical studies in order to promote marketing messages that simply aren’t accurate. The list includes L’Oréal and its clinically proven fountain of youth for skin; Capillus laser hats, clinically proven to regrow hair; Sensa, whose sprinkles guaranteed weight loss with a doctor’s studies as clinical proof; Pom Wonderful juice, clinically proven to treat, prevent, or reduce the risk of heart disease, prostate cancer, and erectile dysfunction; and, more recently, Prevagen, clinically proven to improve memory. The problem with all these assertions, however, is that the research simply does not support the marketing message.
Joining the bandwagon is MyPillow, which recently published a sleep study on its website that claims to be a “double-blind, randomized, placebo-controlled trial of consumer-marketed MyPillow.” The problem is that the trial, which MyPillow paid for, appears to have been an unmitigated disaster that ultimately failed to yield competent and reliable scientific results. Here are a few of the problems the researchers had:
- Initially, 162 nursing home residents were recruited for the study, but they had issues with memory recall, so they were dropped as unreliable subjects.
- To take their place, Latino/Hispanic, Jewish, African-American, Chinese, Middle Eastern, and Russian populations, who were 50 years of age and older, and living in Brooklyn, New York, were targeted. Because a lot of Russians volunteered, the researchers decided to go all in, stating, “[W]e decided to draw our entire sample from participants of Russian ethnicity.” Good for the aging Russians of Brooklyn, but how do their results translate to a more diverse population? When only certain portions of a population are considered, the results of a study may have poor validity.
- Then the Russians became uncooperative. As the study states, “Initially, we were paying participants after each stage; in that scenario many people accepted the first payment and then stopped.” Others refused to give up their pillows: “[M]any of the subjects who started stage one with MyPillow did not want to switch to the placebo pillow.” Still others tailored their responses in an attempt to satisfy the researchers: “43 individuals were eliminated due to questionable integrity in their responses on the weekly questionnaires.”
As a result, the researchers did not include “any comparative results between MyPillow and the placebo,” which means the results of the study are neither randomized nor placebo-controlled. (In case you’re wondering, the placebo was not a foam pillow similar to MyPillow but a goose down pillow, though the exact size, fill amount, and brand are not disclosed in the published study. The researchers did indicate that they used a MyPillow Classic pillow supplied by the company, which leads one to wonder whether these pillows were imprinted with the standard MyPillow logo all over the casing.)
And there are other issues with the study:
- Where’s all the data? The study is long on results and short on backup. We’re told that 192 subjects completed the three-month study but are only provided with scant information about the study group and even less raw data to support the conclusions of the study. Characteristics of the patient cohort that are missing include demographics such as mean age and body mass index (BMI); there is no information concerning medications that participants were taking for conditions such as high blood pressure, anxiety, depression, ADHD, chronic pain, insomnia, and the like; and comorbidities such as insomnia, depression, and obstructive sleep apnea are not provided for the overall cohort. Moreover, whether the study population used pillows and if so, what type or types of pillow, prior to trying the MyPillow Classic, is not disclosed.
- Questionable conclusions. The study concludes, among other things, that “[a]pnea and hypopnea events were significantly decreased after one month of switching to MyPillow.” Yet, in the results section of the paper it states that “33% of the participants showed an adverse effect: increased hypopnic episodes.” And that’s not good.
- Questionable results: The results pertaining to changes in oxygenation and snoring were not statistically significant. In the case of changes in oxygenation, the probability that the findings were the result of chance could not be excluded (the p-value of 0.06 falls above the conventional 0.05 cutoff), and with regard to snoring, the best the researchers could say about a reduction was that “[t]hese results showed a strong positive trend nearing statistical significance,” which means close but no cigar.
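For readers unfamiliar with p-values, the pass/fail check being applied above is simple arithmetic: a finding is conventionally called statistically significant only when its p-value falls below the 0.05 threshold. A minimal sketch (the helper function and the 0.05 cutoff choice are illustrative conventions, not something taken from the study itself):

```python
# Conventional significance threshold; "nearing significance" does not count.
ALPHA = 0.05

def is_significant(p_value: float, alpha: float = ALPHA) -> bool:
    """Return True only if chance can be ruled out at the chosen threshold."""
    return p_value < alpha

# The oxygenation finding reported a p-value of 0.06: close, but it fails.
print(is_significant(0.06))  # False -- not statistically significant
print(is_significant(0.04))  # True -- this one would clear the bar
```

The binary nature of the check is the point: a p-value of 0.06 is not "almost significant" under the convention the researchers themselves invoke; it simply fails the test.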
- Study published in an online magazine. Gold standard RCT studies with statistically significant findings that contribute to existing literature are generally published in peer-reviewed journals, meaning that experts in the field have reviewed the study for proper research methods and believe it to be worthy of publication. In this case, MyPillow’s sleep study was not published in a peer-reviewed journal but rather an online magazine – and even that was short-lived as the study is no longer available on that site.
I could go on, but I think you get the gist of what I’m saying, which is that just because an advertiser says something is clinically proven doesn’t actually mean the advertising claim has been clinically proven.
Update: This blog post was updated on March 15, 2019.