TikTok is becoming swamped with AI-generated conspiracy theory content thanks to a new financial incentive program for creators
Short-form video content creators have an interesting challenge to overcome: In a never-ending feed of videos, how do you grab someone’s attention long enough to keep them watching yours? Increasingly, the answer appears to be to simply make something up, and the more salacious, conspiratorial, or outrageous, the better. Add in some AI-generated imagery, subtitles, and an AI voice-over, and thanks to some creator incentive programs, big bucks may follow.
Media Matters has released a YouTube video featuring its senior video producer Abbie Richards performing a deep dive into exactly why conspiracy theory videos are popping up all over platforms like TikTok and YouTube Shorts, and just how easy they are to create thanks to, you guessed it, the power of AI.
Endeavouring to create its very own short-form conspiracy content, the channel started from the premise that a scientist eating Play-Doh in a lab had recorded themselves growing taller, losing 60 lbs, and curing cancer. Using several AI tools and following the instructions of a YouTube "entrepreneur guru", it produced a ludicrously over-the-top subtitled conspiracy video that looks all too familiar.
In the case of TikTok, blame for this glut of dubious and misleading content is placed squarely at the feet of its recently launched "Creativity Program Beta", which pays creators based on how many views their videos receive.
However, monetisation only applies to videos over 60 seconds long, and a view only counts towards payouts if the user watches for more than five seconds. This financially incentivises creators to make videos with a powerful hook to draw viewers in (in this case, a completely outrageous claim or proposition) and then pad out the content as much as possible to keep them watching.
Thanks to the proliferation of AI image generation tools and AI voice effects, a simple conspiracy script of entirely made-up information, plus some flashy editing of the results, is all that's required to make a video with the potential to earn its creator serious money if it garners enough views. While TikTok itself does not provide exact figures on how much revenue the program pays out, it's estimated that qualifying videos could earn their creators anywhere between $4 and $8 for every 1,000 views.
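To put those figures in perspective, here's a minimal back-of-the-envelope sketch in Python. It assumes the thresholds described above and the estimated $4 to $8 per 1,000 views rate; the function name and the simplified view-counting logic are illustrative only, not TikTok's actual payout formula.

```python
# A rough model of the incentive maths described above. The 60-second and
# 5-second thresholds come from the article; the $4-$8 per 1,000 views
# range is an estimate, not an official TikTok figure, and the real
# eligibility rules are more involved than this.

MIN_VIDEO_SECONDS = 60   # monetisation requires videos over 60 seconds
MIN_WATCH_SECONDS = 5    # a view only counts if watched for over 5 seconds
RATE_PER_1K_VIEWS = (4.0, 8.0)  # estimated USD payout range per 1,000 views


def estimated_payout(video_seconds: float, watch_seconds) -> tuple[float, float]:
    """Return a (low, high) USD payout estimate for one video.

    watch_seconds is an iterable of per-viewer watch times in seconds.
    """
    if video_seconds <= MIN_VIDEO_SECONDS:
        return (0.0, 0.0)  # too short to qualify for the program at all
    qualified = sum(1 for t in watch_seconds if t > MIN_WATCH_SECONDS)
    low, high = RATE_PER_1K_VIEWS
    return (qualified / 1000 * low, qualified / 1000 * high)


# Example: a 90-second video where a million viewers watch past the
# 5-second mark would earn roughly $4,000 to $8,000 on these numbers.
print(estimated_payout(90, (6.0 for _ in range(1_000_000))))
```

On those assumptions, a single viral video clears a four-figure payout, which goes some way to explaining why the format is so attractive to churn out at volume.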
While TikTok might be the primary source for this content, anyone who's had a casual glance through their YouTube Shorts feed recently may have seen videos that look very similar. Video creators often spread their content across multiple platforms to maximise potential viewers, and YouTube Shorts also has its very own ad-based monetisation program.
It’s not just video creators who are leveraging an AI helping hand to maximise user engagement. Several gambling firms have also been reported to be using AI to optimise their content to “help customers make informed decisions” and “prioritise relevant content for users based on their past activity”, a worrying development given that gambling addiction rates appear to be skyrocketing in multiple countries.
Ultimately, the responsibility falls to the companies platforming this media to put policies and controls in place that stop conspiratorial and potentially harmful AI-generated or AI-assisted content from drowning out user-created efforts, and to halt the proliferation of misinformation and dangerous practices.
However, when large financial incentives are involved, it seems unlikely that companies and creators keen to maximise their profits will put in the effort required to stem the tide of AI-generated and AI-assisted media that pulls in the big bucks, even if it does contribute to dangerous societal harms.