UpScrolled’s hate speech problem reveals moderation breakdown after viral growth
Hate content spreads as UpScrolled struggles to keep up with user growth
UpScrolled, a fast-growing TikTok alternative, is learning the hard way that rapid user growth without robust safeguards can backfire. Despite surpassing 4 million downloads within eight months of launch, the social platform is now under scrutiny for allowing usernames, hashtags, and content that contain racial slurs and promote extremist rhetoric.
With parallels to other viral apps like Bluesky, which also faced slur-related user backlash, UpScrolled’s trajectory highlights the hidden costs of viral traction: infrastructure gaps, brand safety risks, and the reputational threat of unmoderated environments.
This article explores the unfolding moderation failure at UpScrolled, what the company is doing to address it, and why the situation raises red flags for brands, creators, and marketers investing in emerging platforms.
Short on time?
Here’s a table of contents for quick access:
- What happened with hate speech on UpScrolled
- Why content moderation matters now more than ever
- What marketers should know

What happened with hate speech on UpScrolled
After surging past 2.5 million users in January 2026, UpScrolled found itself flooded with troubling content: racial slurs embedded in usernames, hashtags referencing extremist ideologies, and unmoderated videos glorifying figures like Hitler.
Investigative reporting by TechCrunch confirmed that despite user reports and screenshots submitted to the platform’s team, many accounts featuring offensive usernames and content remained active days later. The Anti-Defamation League (ADL) added fuel to the fire, publishing its own findings showing the app was harboring antisemitic and extremist content — including from organizations designated as terrorist groups.
UpScrolled, founded in 2025, responded by promising improvements. In a public video, CEO Issam Hijazi acknowledged the issue and pledged to scale up the content moderation team and improve technical systems. “We are offering everyone the freedom to express and share their opinions in a healthy and respectful digital environment,” he stated.
What does it take to build a safer social platform? @iHijazi breaks it down. #UpScrolled #UserSafety #FreeSpeech
— UpScrolled (@realUpScrolled) February 10, 2026
However, the gap between UpScrolled’s community guidelines — which ban hate speech and harassment — and what’s actually appearing on the app suggests enforcement has not kept pace with growth.
Why content moderation matters now more than ever
The UpScrolled saga follows a familiar pattern: a social platform goes viral, draws a flood of new users, and gets blindsided by the demands of real-time moderation at scale.
Even larger platforms like Meta and X (formerly Twitter) struggle with this, but early-stage startups are particularly vulnerable. Without automated detection systems, multilingual moderation support, and clear escalation processes, it’s nearly impossible to manage abuse in a fast-growing user base.
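To make "automated detection" concrete, here is a minimal, hypothetical sketch of what a first-pass username screen might look like at signup. Everything in it (the BLOCKED_TERMS set, the screen_username function) is an illustrative assumption, not a description of UpScrolled's actual systems:

```python
# Hypothetical first-pass username screen an early-stage platform might run
# at signup. Names and terms here are illustrative placeholders only.
import unicodedata

# A real deployment would source this list from policy teams and cover
# multiple languages; a tiny placeholder set stands in for it here.
BLOCKED_TERMS = {"exampleslur1", "exampleslur2"}

def normalize(text: str) -> str:
    """Lowercase, decompose accents, and drop non-alphanumeric characters
    so simple evasions (dots or underscores inside a slur) are caught."""
    text = unicodedata.normalize("NFKD", text).lower()
    return "".join(ch for ch in text if ch.isalnum())

def screen_username(username: str) -> bool:
    """Return True if the username should be held for human review."""
    flattened = normalize(username)
    return any(term in flattened for term in BLOCKED_TERMS)

if __name__ == "__main__":
    for name in ["friendly_user", "Example.Slur1_fan"]:
        verdict = "flag for review" if screen_username(name) else "allow"
        print(f"{name}: {verdict}")
```

A filter this naive is only a first line of defense; the harder problems are multilingual term coverage, evasion tactics beyond simple separators, and the human-review escalation path once something is flagged.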
For marketers, the implications go beyond bad PR. A platform’s inability to control harmful content can put brand safety at risk — whether through ad adjacency or backlash from values-conscious customers.
Platforms that promise “freedom of speech” often walk a tightrope between user expression and harmful amplification. And in UpScrolled’s case, its hands-off FAQ stance (“we don’t censor opinions”) seems to have left the door open to coordinated abuse.
This tension becomes especially dangerous when a platform becomes a magnet for fringe or extremist groups seeking less-moderated digital spaces.
What marketers should know
If your brand or client is experimenting with emerging platforms like UpScrolled, here’s what you need to weigh right now:
- Vet moderation policies early: Don’t rely on vague commitments. Ask how the platform enforces its rules, what tools are in place, and how quickly it acts on abuse.
- Watch platform sentiment: Public trust in a social app can nosedive quickly. Stay on top of user reports, watchdog publications, and creator sentiment.
- Consider platform risk as part of your media mix: Don’t over-index on new apps that haven’t proven their safety standards. Build redundancy into your distribution strategy.
- Engage your legal and compliance teams: If you’re working with UGC platforms that lack rigorous moderation, you may need additional legal review for sponsored content.
As UpScrolled scrambles to scale its moderation, its users — and by extension, marketers — are left in limbo. Until enforcement improves, brand engagement on the app comes with a real reputational tradeoff.