Meta refuses to come to the table.
This company has failed to address problem after problem with its platform design, equivocating with vague press releases and insufficient feature updates. Stakeholders like us call for content feeds that are safe for young users—free of violence, sextortion, and harmful trends—and all we get in return are handwavy claims that protecting kids online is a priority.

Enough is enough, we decided. We marched our way down to Meta’s front door.
As we stood together in Astor Place, the morning sun beamed down on us. Our voices picked up as the spring day took shape; they carried clearly along the block. We raised our signs and lined the stage with banners. A brave group of speakers took turns at the podium, rallying us to get moving.
A choir sang between remarks from parent survivors who laid the hard truth bare: their children’s lives were lost because of Meta’s platforms.
Youth leaders joined in support, including both Design It for Us co-chairs. Arielle Geismar took us back seven years to her NYC public school days, when algorithmic systems forced eating disorders upon her and drove her best friend to suicide. Meta’s dark patterns, she said, “are the epicenter of our pain.”
But “we are not going anywhere”; this is only the start, as Zamaan Qureshi emphasized in turn: “Zuckerberg, you had your chance. But our generation isn’t waiting for you anymore.”

Our procession turned the corner to 770 Broadway, home to Meta’s offices. By the main entrance, more parents spoke, remembering their loved ones over bouquets and framed pictures. The rest of us circled around the building in support. I gripped my banner a little tighter as my friend leaned in, took a deep breath, and said, “We’re really doing this.”
“Finally,” was all I could get out. I looked down at the message between us as we stood across the street for all eyes to see: Meta Profits. We Pay the Price.[1]

On April 24, 2025, Design It for Us co-organized a protest with Heat Initiative, ParentsTogether, and other responsible tech advocates from around the country. We hand-delivered this letter to Meta’s NYC headquarters; with over 10k signatures (you can still add your name!), it calls for three basic remedies:
an end to the algorithmic promotion of dangerous content to users under 18 years old,
measures to block predatory adults from reaching kids, and
more robust and reliable reporting procedures.
These, we maintain, are bare-minimum asks, all rooted in the testimonies of the families present and the stories of young people whose deaths trace back to social media.
Spoiler alert—as I write this, Meta’s response to our action is merely deflection, saying they’ve already addressed “parents’ top concerns.”
But something is in the air these days. Our Movement is alive in the public consciousness, flowing through internet discourse, serious research, and major streaming hits. Whistleblower books are flying off the shelves, and Gen-Z activists are featured on prime time with increasing regularity.
Efforts like this protest are more necessary than ever—to charge the atmosphere bit by bit, to keep bending the momentum in our favor. This is more than a little meetup to let off some steam. It’s how we stay united, grow stronger together, and hold Big Tech accountable.
Countless people stopped us on the streets of downtown Manhattan, asking how they could help; many insisted on doing more after they signed our letter. At one point, a couple stopped their car next to me. The woman in the passenger seat rolled her window down; the two of them gave me the firmest thumbs-ups I’ve ever seen and drove on.
We are gaining real ground on Meta. Our collective shouts for change boost us along the path to “build a future where children are protected,” as we chanted all morning.
The protest proceeded peacefully, but I couldn’t help considering what might happen afterwards, given Meta’s track record.
In just the past few months, Meta has run extensive ad campaigns to posture on the topic of kids’ online safety. It’s lobbying in a gilded disguise: full-page placements in The New York Times and screens across D.C., all claiming the company would do anything to protect children.

Mounting evidence tells a different story: Meta employees have long been aware of industrial-scale danger on their platforms, yet the company chooses not to invest wholeheartedly in removing it.
The company just spent weeks on trial against the FTC. The agency argues that the consolidation of Facebook, Instagram, and WhatsApp has monopolized personal social networking in violation of U.S. antitrust law. Scathing details are emerging in discovery; one takeaway is that Meta’s exorbitant profits grow at the expense of product integrity. As we all know, mindless, addiction-coded scrolling is the very core of Meta’s revenue. The company generally works to maximize the number of ads we swipe past in any given session. To this end, its apps will, for instance, recommend plenty of the same posts to child predators and minors—or even prompt predators to follow children. This enables such connections as a byproduct of engagement, while the company turns as many blind eyes as it can afford to.[2]
What a wicked status quo to maintain.

My mind snapped back from these thoughts when a custodian slipped outside. With a broom and dustpan, he started sweeping away, awkwardly poking between speakers’ legs as though no one were there. I tensed up a little as I watched him collect invisible debris off the sidewalk. Discomfort loomed as employees shuffled around this scene and through the revolving door in the background. We were physically in the way of people getting to work that morning.
But what is a minor inconvenience next to the impact their corporation has had on us gathered there? Discomfort doesn’t even begin to describe what we’ve been through—what all of us have endured for well over a decade.
Coalition member Ava Smithing posed this question to everyone gathered after we walked back to Astor Place. Ava, who leads advocacy and operations at the Young People’s Alliance, brought a sobering reminder that “justice for those we have lost will only come when we restore true life for those who live on.”
So let’s not shy away from the discomfort. Let’s keep raising our voices, keep making people sit with the tense, ugly truths of what social media has become.
Because, to quote Ava once more, “We deserve more. And we deserve it now.”
This is the official response Meta released to news outlets after the protest. It refers to the controversial teen-account settings that rolled out back in September 2024:
“We know parents are concerned about their teens’ having unsafe or inappropriate experiences online. It’s why we significantly changed the Instagram experience for teens with Teen Accounts, which were designed to address parents’ top concerns …”
In other words, Meta is claiming they already have the three things we demanded in the letter. We disagree.
With the help of our colleagues at Accountable Tech, Design It for Us members subsequently published an experiment to test the merits of Instagram teen accounts. The report details the trajectory of content recommendations over a two-week period, from the moment of account creation onwards. With one hour of activity per day—following mainstream accounts, liking a few posts, and so on—every participant was served explicit content, and supposed defaults like muted nighttime notifications did not live up to Meta’s promises.
A Meta spokeswoman dismissed the young people’s voices as “manufactured” and “flawed.” Her response nitpicked fine details of the research; she, for instance, would have preferred a slightly different definition of “sensitive content” in the report.
Alright, but what she didn’t acknowledge is this: teen accounts remain a far cry from a safe online experience, because whatever safeguards they do include are demonstrably, appallingly insufficient.
If this is how the discourse will continue, we have a steep hill to climb. We need all of you on board, lobbying at the state and federal levels, phone banking, planning more in-person activations, and so forth. So don’t hesitate—reach out to learn more about specific projects and how to support the coalition. We’d love to have you.
Thank you, as always, for your attention. Let’s keep pushing the Movement onwards.
Nick Plante, Organizing Director
[1] A friendly reminder: Meta profits every time you scroll through one of their apps. Advertisers provide the bank notes, the payments from their ad accounts—but none of that works without our time and attention, our fingers to slide and tap, our eyeballs for the ads to reach in the first place. Social media is probably your most expensive habit right now: the true price is your mental health.
[2] They can afford a lot. Surely they’ve calculated this sort of thing down to the penny.
"Her response nitpicked fine details of the research; she, for instance, would have preferred a slightly different definition of 'sensitive content' in the report.
Alright, but what she didn’t acknowledge is this: teen accounts are a far cry from a safe experience online, as whatever parameters exist in them are demonstrably, appallingly insufficient."
Great point! It is so troubling that Meta seems more concerned with playing semantic games than with addressing the ugly issue at its core.