April 27, 2024

‘What The Hell Were You Thinking?’: Ted Cruz Confronts Mark Zuckerberg On Instagram Reportedly ‘Helping Pedophiles’



Republican Texas Sen. Ted Cruz grilled Meta CEO Mark Zuckerberg during a hearing on Wednesday about Instagram allegedly assisting pedophiles in accessing inappropriate child sexual content.

During the Senate Judiciary Committee hearing on “Big Tech and the Online Child Sexual Exploitation Crisis,” Cruz cited investigative reporting by The Wall Street Journal exposing how numerous pedophiles have exploited Instagram’s algorithms and networking features for predatory activity. Cruz also presented a printout of an Instagram warning screen that gives people searching for child abuse material the option to view the content anyway, and pressed the Meta CEO about it.

“What was particularly concerning about The Wall Street Journal expose was the degree to which Instagram’s own algorithm was promoting the discoverability of victims for pedophiles seeking child abuse material,” Cruz asserted. “In other words, this material wasn’t just living on the dark corners of Instagram. Instagram was helping pedophiles find it by promoting graphic hashtags.”

Cruz then showed the Instagram warning screen.

“These results may contain images of child sexual abuse,” it says. It then gives users the option to either “get resources” or “see results anyway.”

“Mr. Zuckerberg, what the hell were you thinking?” Cruz asked about the warning screen.

Zuckerberg responded that research shows it is beneficial to give these people the option to obtain help.

“I understand ‘get resources,’” Cruz responded. “In what sane universe is there a link to ‘see results anyway’?”

“Well, because we might be wrong,” Zuckerberg responded.

In a statement responding to the reporting, Meta told the WSJ it was exploring “ways to actively defend against this behavior.” The company established an internal child safety task force in June after the reporting, but it has still had difficulty resolving the issue, according to the WSJ.

More at:



Senator Ted Cruz Grills Mark Zuckerberg On Facebook’s Failure To Protect Children From Exploitation



Opinion

Social media CEOs won’t stop their products from harming kids until they’re forced to


CEOs of social-media companies gave testimony before the Senate Judiciary Committee on what they’re doing to prevent harm to children who use their sites.

Meta CEO Mark Zuckerberg wanted to sound sincere when he stood up and apologized Wednesday to a crowd of parents whose kids died after experiencing sexual extortion or harassment online.

But his words were empty when you consider how little he did to fight the bullying, pornography, harassment and filth peddled by Facebook and Instagram, and the damage that it’s done to generations of children.

Or how Zuckerberg, in 2021 emails, refused to expand the company’s child safety and well-being workforce.

Yesterday’s Senate Judiciary Committee hearing on online child sex abuse was a parade of lowlights from Big Tech.

The CEOs of TikTok and Meta bragged about “how much” their companies are doing to prevent harm to their underage users, which boiled down to platitudes, not actual action.

TikTok pledged to spend $2 billion to enhance “safety” on the site; Zuck bragged that 40,000 employees are working on safety and security and that his company spent $5 billion on safety efforts last year.

Taken on their own, these figures sound impressive to most people, as their PR handlers no doubt intended.

But not so impressive if you put them in their true context.

TikTok’s parent ByteDance pulled in $24.5 billion in revenue in just the first quarter of 2023, while Meta made more than $28 billion.

So the “child safety” spending is barely a drop in the bucket, even compared to what they spend tweaking their algorithms to make them more addictive.

Plus: Meta laid off employees from its safety teams at both Instagram and Facebook a year ago.

And of course the American version of TikTok is far more child-toxic than the one in ByteDance’s homeland, China.

Post reporting has shown how teens who start an account are immediately guided to bigoted, sexual and violent content.

These products are designed, and endlessly re-designed, to be addictive; the entire business model is built on “grabbing eyeballs” to sell to advertisers.

So what if kids and teens are especially susceptible to that addiction?

Pew Research Center found that 1 in 5 US teens aged 13 to 17 are on YouTube and TikTok “almost constantly.”

Some 47% are daily users of Meta’s Instagram and 19% visit Meta’s Facebook daily.

According to Common Sense Media, approximately 38% of children between the ages of 8 and 12 were using social media in 2021.

As for “safeguards”: Most sites may have a minimum age of 13 to join, but verification is practically nonexistent, so younger children routinely gain access simply by “certifying” that they’re old enough.

That’s only “safety” for the company — from lawsuits.

As for the impact: Social-media sites are bottomless cesspools of content that drive anxiety, depression and body-image issues in America’s youth.

The National Institutes of Health links teen social-media use to problems with “sleep, addiction, anxiety, sex related issues, behavioral problems, body image, physical activity, online grooming, sight, headache and dental caries.”

If any other product were causing such harm to children, it would get yanked off the market, or at least heavily regulated.

Big Tobacco played dumb about the impact of its products for decades, even while maximizing their addictive properties.

How are TikTok and Meta any better?

Actually, they’re worse: They deliver their dopamine rushes for free.

We’re not often fans of government regulation, but social-media giants have shown again and again they can’t be trusted.

They’ll choose the path of the most profit, no matter the impact on mental health or society.

A hearing is not enough.

Congress and the administration need to get serious about actual penalties for bad behavior.

Start by removing the legal protections that let them pretend they’re not responsible for what appears on their platforms.

And then serious enforcement, with serious penalties, for violations.

Sorry is not enough.


