By Chinua Albert Okafor @The_RoboRai
Meta, Facebook's new corporate name, announced today that it will enforce stricter policies against misinformation about COVID-19 vaccines for children (via Engadget), following the FDA's recent authorization of Pfizer's COVID-19 vaccine for children aged five to 11.
The platform began restricting COVID-19 vaccine misinformation in late 2020, but until now it had no policies specific to children.
Meta says in a new blog post that it is working with the CDC and WHO to remove harmful content related to children and the COVID-19 vaccine. That includes any post implying the COVID-19 vaccine is unsafe, untested, or ineffective for children. Meta will also show users reminders, in English and Spanish, that the vaccine has been authorized for kids, along with information on where to find it.
Since the start of the pandemic, Meta has removed a total of 20 million pieces of COVID-19 and vaccine misinformation from Facebook and Instagram combined. That figure sits awkwardly alongside Facebook's leaked internal documents, which showed how unprepared the platform was for misinformation about the COVID-19 vaccine. Had Facebook been better prepared, it could have launched campaigns against misinformation earlier in the pandemic and removed more false content as a result.