TikTok mistakenly posted a link to an internal version of its new AI digital avatar tool that lacked guardrails, allowing users to generate avatar videos that said virtually anything. The error, first identified by CNN, made it possible to create videos quoting Hitler and messages encouraging harmful actions such as drinking bleach. TikTok has since removed this version, while the intended version remains available.
Launched earlier this week, TikTok’s Symphony Digital Avatars let businesses generate ads using the likenesses of paid actors. The tool uses AI-powered dubbing to make avatars read scripts that fall within TikTok’s guidelines. Although the tool is supposed to be accessible only to users with a TikTok Ads Manager account, the internal version found by CNN allowed anyone with a personal account to use it.
Technical Error and Consequences
TikTok spokesperson Laura Perez stated that the "technical error" has been resolved and that only a very small number of users were able to create content with the internal version during the few days it was accessible. CNN used the tool to generate videos featuring Osama bin Laden’s “Letter to America,” a white supremacy slogan, and misinformation about voting dates. Unlike videos made with the proper version of Symphony Digital Avatars, these carried no watermark indicating they were AI-generated.
Although CNN did not post the videos to TikTok, Perez noted that such content would have been rejected for policy violations. Despite the removal of the flawed version, concerns remain about potential misuse of the digital avatar creator and whether TikTok is prepared to handle such issues.