WHEN the Collins Dictionary anointed “fake news” its 2017 word of the year, it was presumably because the publishers deemed the term so important it didn’t really matter that it wasn’t actually a single word.
Misinformation is hardly new, but the platforms it spreads on these days certainly are. Google, Facebook and friends provide an unprecedented capacity for the misleading, or downright dishonest, to be disseminated and amplified.
In April 2018, Facebook CEO Mark Zuckerberg acknowledged to the US Congress that his organisation had, among other things, been “too slow to spot and respond to Russian interference” in the 2016 US presidential election using the social media site.
The company, which has also faced allegations of anti-right-wing bias, has made various attempts to crack down on fake news since 2016.
Earlier this month, it advertised for two “news credibility specialists”, people with “a passion for journalism, who believe in Facebook’s mission of making the world more connected”.
Within a day, following reports in other media, the ad had been replaced with one for two “news publisher specialists” and the passion for journalism had gone.
Whatever the job title, it’s hard to see what two people could do to curb the unruly behemoth that is the internet.
And would it be a good thing if they could? Do we want private corporations such as Google, Facebook and Co to become the effective “censors-in-chief” of our networked age?
Facebook, it’s worth remembering, is the platform that for many years refused to allow women to post pictures of their breastfeeding babies because this breached its nudity provisions.
That said, there’s little doubt about the potential and actual harms of the proliferation of fake news, for health as much as for politics.
Fake health news online runs the gamut from AIDS denialism to breathless announcements of miracle cures for cancer.
A recent article in Undark magazine – which I, ironically enough, first saw on Facebook – outlines some of the risks posed by this kind of misinformation.
Indian engineer Biswaroop Roy Chowdhury, for example, received 380 000 views within weeks for a video he posted on YouTube arguing that HIV did not exist and that antiretrovirals were the real cause of AIDS.
When questioned by Undark, Chowdhury claimed that 700 people had been in touch to tell him they had stopped their HIV medication as a result.
In a similar vein, the Independent last year examined the proliferation of quackery in health news shared on Facebook.
The most popular cancer story on the social networking site over the previous year was one claiming dandelion root cured cancer “better than chemotherapy”.
The article had received more than 1.4 million Facebook likes, shares or comments, despite there being no actual evidence to support the root’s efficacy as a cancer treatment.
Perhaps the solution lies not in censorship per se, but in recognising the ways online platforms differ fundamentally from the media outlets they are replacing.
Traditional media outlets provided a curated view of what was happening in the world to a broad sector of the population. The system was by no means perfect but it did at least mean people across society were able to form their views based on pretty much the same information.
In today’s more fragmented world, each of us can now inhabit our own personal “filter bubble”, exposed only to information that confirms our pre-existing beliefs.
If you frequent antivaccination sites online, a Google search for “vaccine safety” will prioritise news from those sites rather than, say, the MJA or the Cochrane Collaboration.
If your Facebook friends believe doctors are the tools of Big Pharma, the news items you see on the site will be more likely to come from Natural News than the New York Times.
The online platforms have an interest in skewing what you see in this way. It’s how they make their money.
The more content is tailored to your particular interests and beliefs, the more valuable the adjacent space is to the advertisers who provide the sites’ revenue.
If we really want to undermine the spread of fake news, we’re going to need to attack the algorithms that so effectively promote it.
It’s hard to see Google and Facebook getting on board with that.
Jane McCredie is a health and science writer and editor based in Sydney.
To find a doctor, or a job, to use GP Desktop and Doctors Health, book and track your CPD, and buy textbooks and guidelines, visit doctorportal.