An anonymous reader quotes the Huffington Post:
Six months ago, under tremendous public pressure, YouTube announced that it would tweak its algorithm to recommend fewer videos “that could misinform users in harmful ways.” It was a major step for a company that has spent years driving people toward increasingly sensationalist content — including dangerous disinformation — that would keep viewers glued to their screens for as long as possible to maximize advertising revenue. The announcement in late January triggered panic within YouTube’s sprawling network of conspiracy theorists. [But] the audience for YouTube’s top conspiracy theory channels is still growing, a HuffPost investigation has found… Some channels are growing at slower rates than before, others at around the same rates or a bit more rapidly… [A]ll are still drawing in new viewers — and the creators behind them remain undeterred.
There are significant financial incentives for conspiracy theorists to keep churning out clickbait disinformation on YouTube: They can still promote their merchandise and third-party fundraising pages in their videos, and they can still take a cut of the earnings from ads on their content through YouTube’s monetization program. The payoff can be huge.
Views from video recommendations, which can be especially vital for new YouTube pages trying to develop audiences, have been cut in half for content featuring harmful misinformation, a YouTube spokesperson told HuffPost. But for massive conspiracy theory channels — channels that YouTube’s algorithm has already catapulted into notoriety, giving them large and loyal followings — the change has been largely ineffective in suppressing their influence… YouTube acted “way too late,” said former Google engineer Guillaume Chaslot, who helped design YouTube’s algorithm. “The harm that’s been done in many cases can’t now be undone.”