
On Facebook, anti-vaxxers urged a mom not to give her son Tamiflu. He later died.

Online groups that routinely traffic in anti-vaccination propaganda have become a resource for people seeking out a wide variety of medical information.

Facebook groups that routinely traffic in anti-vaccination propaganda have become a resource for people seeking out a wide variety of medical information — including about the ongoing flu season.

Facebook hosts a vast network of groups that trade in false health information. On “Stop Mandatory Vaccination,” one of the largest known health misinformation groups with more than 178,000 members, people have solicited advice for how to deal with the flu. Members of the group have previously spread conspiracy theories that outbreaks of preventable diseases are “hoaxes” perpetrated by the government, and have used the group to mass-contact parents whose children have died, suggesting without evidence that vaccines may be to blame.

One recent post came from the mother of a 4-year-old Colorado boy who died from the flu this week. In it, she consulted group members while noting that she had declined to fill a prescription written by a doctor.

The child had not been diagnosed yet, but he was running a fever and had a seizure, the mother wrote. She added that two of her four children had been diagnosed with the flu and that the doctor had prescribed the antiviral Tamiflu for everyone in the household.

“The doc prescribed tamiflu I did not pick it up,” she wrote.

Tamiflu is the most common antiviral medication prescribed to treat the flu. The drug can ease symptoms and shorten the length of illness, but concerns about side effects are common even outside anti-vaccination echo chambers. The flu has hit children particularly hard this season. Pediatric hospitalization rates are higher than normal, and 68 children have died, according to the Centers for Disease Control and Prevention.

NBC News verified the posts by cross-referencing them with a fundraising page set up by the family, along with published news reports quoting the family.

The posts highlight how Facebook groups dedicated to misinformation about health topics such as vaccination can also be used to solicit and share potentially dangerous medical advice. A study by the American Academy of Family Physicians found that 59 percent of parents said their child had missed the flu shot at least once due to “misinformation or misunderstanding.”

None of the 45 comments on the mother’s Facebook post suggested medical attention. The child was eventually hospitalized and died four days later, according to a GoFundMe started on his behalf by his family.

The mother also wrote that the “natural cures” she was treating all four of her children with — including peppermint oil, Vitamin C and lavender — were not working and asked the group for more advice. The advice that came in the comments included breastmilk, thyme and elderberry, none of which are medically recommended treatments for the flu.

“Perfect, I’ll try that,” the mother responded.


The mother’s recent posts have since been deleted from Stop Mandatory Vaccination, but in group posts going back to 2017 she said she had not vaccinated her children against the flu.

The mother did not respond to a request for comment.

A Facebook spokesperson said in an emailed statement: “This is a tragedy and our thoughts are with his family and loved ones. We don’t want vaccine misinformation on Facebook, which is why we’re working hard to reduce it everywhere on the platform, including in private groups.”

In an emailed statement, the Colorado Department of Public Health and Environment confirmed that the preschooler had died from the flu and said it did not have records showing whether the child was vaccinated.

“While flu is circulating, it is not too late to get a flu shot, and we recommend everyone ages six months and older who has not had the yearly vaccine get it,” the department said.

Over the last year — amid nationwide concern over vaccine hesitancy and the worst measles outbreak in decades — Facebook has taken steps to limit the volume and reach of groups that spread anti-vaccine content.

Following similar decisions by Pinterest and YouTube, Facebook announced in March that it would limit the reach of anti-vaccination content, no longer serve up anti-vaccination groups and pages in search results and the recommendations bar, and no longer allow users and groups that spread vaccine misinformation to place ads or run fundraisers. In September, Facebook rolled out pop-up warnings for users searching for vaccine-related content.

But Facebook has stopped short of banning the anti-vaccine groups themselves, citing unease with being the arbiter of truth.

Facebook groups are a hotbed of vaccine misinformation, said Kolina Koltai, a researcher at the University of Texas at Austin who has studied the social media behavior of the anti-vaccination movement since 2015. Koltai said she has seen similar posts in which women reported that their children were sick with measles or cancer and received medically questionable advice.

“These communities have become a haven or resource for parents and for women to connect with others and ask for help,” Koltai said.

These groups serve chiefly as information exchange hubs, and when members recommend medically unsound advice, the consequences can be severe.

“This is what we warn about,” Koltai said.

CORRECTION (Feb. 7, 2020, 1:54 p.m. ET): An earlier version of this article misstated the number of members that the Facebook group "Stop Mandatory Vaccination" has. It has 178,000, not 139,000.