Is It Even Possible to Dam the Flow of Misleading Content Online?
As a polarizing US presidential election nears, moderating controversial content on social media poses a pressing problem for tech giants. But no matter how many employees they hire, lines of code they write, or new content policies they implement, major platforms face an overwhelming task. In the second quarter of 2023 alone, for example, Meta, Facebook's parent company, took action on 13.6 million pieces of terrorism-related content and 1.1 million posts…