By: Nabeela Khan
July 15, 2024
On August 15, 2023, YouTube updated its guidelines on medical misinformation, stating that it will remove content that “promotes cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment.” Citing the example of videos promoting garlic as a possible cure for cancer, the platform added that such content will be removed.
However, almost a year later, we found that the platform is rife with misinformation about the disease. We reviewed over 150 videos and found at least 30, with over 14 million combined views, promoting unproven and unscientific treatments for curing cancer. While most of these videos were uploaded as recently as a month ago, some date back to 2022, 2019, and 2013.
Using Logically Facts® Accelerate, a tool that enables proactive discovery of fact-check-worthy content on platforms, we found nearly 75 videos that made claims specific to garlic. These videos, including Shorts, were uploaded between September 2023 (after YouTube revised its medical misinformation policy) and April 2024. Common claims included ‘garlic can cure colon cancer,’ ‘garlic can reduce the incidence of stomach cancer by half,’ and ‘garlic may help reduce the risk of certain cancers.’
Screenshot of videos promoting alternative cancer treatments. (Source: YouTube)
A video uploaded on January 22, 2020, with over 730,000 views, promoted the use of garlic for the common cold and cancer. Another, uploaded on January 28, 2013, with nearly 290,000 views, claims that garlic contains flavonoids that can help protect against cancer.
Screenshot of videos promoting unscientific cancer cures on YouTube. (Source: YouTube)
These videos continue to be on the platform despite YouTube clearly stating in its 2023 policy statement that such content will be removed.
In response to our questions, sent along with a list of 30 videos, the platform said it removed one video. However, it was still available on the platform at the time of publishing this story. The video promoted graviola as a possible cure for cancer and had nearly 11,000 views. The platform added, “Upon review, we blocked ads on several videos for violating our Advertiser Friendly Guidelines, and found some videos to be eligible for limited monetization only.” On being asked about the other videos, the platform said they did not violate its medical misinformation policy.
The other videos shared with the platform included claims made by Australian self-proclaimed naturopath Barbara O’Neill. In 2019, the New South Wales Health Care Complaints Commission (HCCC) banned her from providing health services, concluding that she “poses a risk to the health or safety of members of the public.”
Commenting on the prevalence of health misinformation, Yesha Tshering Paul and Amrita Sengupta, researchers at the Centre for Internet and Society (CIS), an Indian research body focused on the internet and digital technologies, said that several factors have contributed to the spread of medical misinformation on the platform.
“The platform uses a combination of human (including community and trusted flaggers) and automated content moderation, but manual review is time-consuming and expensive (and hence deprioritized), and automated filters are often easy to bypass or manipulate. Creators may couch their claims in language that sounds credible or falsely claim to be medical professionals or utilize misleading titles and thumbnails,” Sengupta and Paul said.
A quick search on YouTube with the keywords “cure for cancer” revealed the extent of this problem. We came across several videos where the suggested cancer treatment was unscientific, but the language and terminology could easily mislead viewers.
Screenshot of a video claiming that certain food choices may help treat cancer. (Source: YouTube)
Dr. Prerna Juneja, assistant professor at Seattle University, says that it is not impossible for algorithms to detect and remove such misinformation. “Misinformation can be subtle and nuanced, making it hard for algorithms to detect accurately. That being said, platforms have created really good algorithms for spam, CSAM content, and vaccine-related misinformation, so detecting and removing such videos is not impossible.”
Emphasizing the need to prioritize authoritative sources in search results and recommendations, Juneja said, “This can help ensure that when users search for health-related topics, they are more likely to encounter credible information from trusted sources. Implementing information panels or alerts next to videos that discuss health topics and linking to verified sources like WHO [World Health Organisation] or CDC [Centers for Disease Control and Prevention] can help provide viewers with context and correct information.”
In 2021, YouTube launched a team to build relationships with medical groups to add videos with reliable health information to the platform. It also added a personal stories shelf to surface relevant personal videos of people discussing health conditions. To begin with, the platform surfaced videos on cancer, anxiety, and depression. The platform also took steps during the COVID-19 pandemic to curb the spread of misinformation, including attaching warning labels, removing content, banning user accounts, and directing users to authentic resources.
Despite these efforts, videos with misleading and even harmful narratives remain plentiful on the platform.
A recent video, uploaded on March 29, 2024, to a channel with over a million subscribers, seemingly promotes the use of apricot seeds as an alternative treatment for cancer. While the video doesn’t make any sweeping statements, it suggests that apricot seeds hold potential for treating cancer. However, this oft-repeated claim has been debunked several times over the years.
Screenshot of the video talking about apricot kernels. (Source: YouTube)
Studies have also highlighted that social media platforms are fertile ground for cancer misinformation. A paper by Dr. Skyler Johnson and Dr. Briony Swire-Thompson, published in the peer-reviewed journal Current Opinion in Psychology, argued that while cancer might appear to be a niche subject, it is an important one given its impact on society. The authors discussed the prevalence of cancer-related misinformation online and the potential harm it may cause.
Stressing the complex nature of cancer-related misinformation, Dr. Johnson, assistant professor at the University of Utah Huntsman Cancer Institute and a co-author of the paper, told Logically Facts, “Much of the misinformation we have studied exists along a complex spectrum that can be false, mostly false, a mixed true and false, mostly true or true. It can be really difficult for non-experts to evaluate those articles that are mixed or mostly true. They can evade moderation just as many alternative practitioners do by obfuscating truth with fiction.”
He added, “They [YouTube] likely are overwhelmed by the number of posts that contain misinformation. It's likely impossible to moderate and remove all of this content. It's likely that they will be focusing on the content that is most harmful and has the greatest reach (views).”
We contacted two families and an individual who resorted to alternative treatments. A cancer patient's daughter told us that while her family initially followed the doctor's advice and started chemotherapy, they resorted to alternative treatment as her mother's condition deteriorated.
The alternative treatment lasted nearly six months, but there was no sign of improvement. The daughter said that her mother underwent radiotherapy, but to no avail. She told us that in an attempt to save her mother, she gave her carrot juice, something she had read about online, but that led to stomach problems. "I would never advise anyone to resort to online treatments as every cancer is different," she said. Her mother lost her battle with cancer in June 2017.
Prof. Hassan Vally, an epidemiologist at Deakin University, said that one reason alternative treatments are so popular is that people often believe ‘natural’ products are safer than pharmaceutical products. “Natural medicines also align with the holistic health philosophy, which emphasizes treating the whole person rather than the disease,” he added.
Dr. Johnson, on the other hand, highlighted fear and financial incentives as reasons. “It's novel, unique, and noteworthy. It keys into patients' misconceptions surrounding their personal desires to have autonomy and maintain a sense of control. It plays on illogical fears surrounding conventional cancer therapies like surgery and chemotherapy. There are also many other factors at play, including perverse financial incentives and unscrupulous purveyors of those that spread the misinformation,” he said.
A research study examined factors that might be responsible for people believing in health misinformation. It highlighted that “susceptibility to health misinformation is driven by multiple psychological processes.” Therefore, combating health misinformation requires an understanding of why people believe in misinformation and who is most susceptible.
Responding to the question on what can be done to improve the online information space, Dr. Swire-Thompson said, “There are many interventions that could help: scientific literacy, media literacy, health literacy, corrective information. The difficulty is getting these interventions to the people who need them most. Furthermore, having platforms remove misinformation is likely to have a much bigger impact.”
(Update: The story has been updated with a new response from YouTube.)
(Editor’s Note: We have refrained from adding active or archive links to the YouTube videos in an attempt to not amplify them further.)