One rather glaring problem with YouTube’s plan is that it seems ill-equipped to handle conspiracy theories that crop up rapidly in the wake of major news events such as the Parkland shooting. Those memes—such as the one smearing Parkland survivor David Hogg as a “crisis actor”—are liable to make the rounds on YouTube well before they’ve been authoritatively debunked on Wikipedia, let alone added to a master list of well-known conspiracies. Even with human moderators on hand, some of these conspiracy theories have climbed to the top of YouTube’s prominent list of trending videos, earning millions of views in the process, before being taken down.
Another issue: Wikipedia is vulnerable to trolls and propagandists, just like any other platform that relies on the public to produce and curate content. It might become more so now that people know Wikipedia holds the keys to YouTube’s conspiracy-debunking apparatus, such as it is. One prominent Wikipedia editor was quick to warn that YouTube’s misinformation problems run deeper than anything that could be solved by an “irregularly updated *encyclopedia*.”