Social media is killing our teens. Cyber intelligence expert says a ban isn’t enough to save them

Paul Raffile is passionate about making the internet a safer place for our youth, but believes tech giants need to lead the charge.

The cyber intelligence analyst worries that government actions — such as lifting the age of access to social media, as is currently proposed — may absolve organisations of responsibility for ensuring their own platforms are safe.


While most social media platforms currently do not allow access to users under 13, there have been calls to raise this limit to 16.

How this ban would be enforced is unclear, with various methods including ID verification under consideration.

Raffile does not oppose raising the age limit for social media platforms, but he believes action should go beyond that, he says in an interview for an upcoming 7NEWS Spotlight episode.

“I think it’s a good start, but I don’t think it’s the only thing that we can do,” Raffile said.

“I think it has to go beyond that.

“It’s almost as if you’re going to an amusement park and there’s that sign that says, ‘You must be this tall to enter this ride’.

“In one instance, we’re protecting the vulnerable (from) something that might be dangerous to them.

“But what happens if that entire roller coaster is just simply dangerous?”

Raffile is a prominent intelligence consultant, and was even briefly set to take on a role at Meta as a ‘human exploitation investigator’.

Cyber intelligence analyst Paul Raffile believes tech giants need to lead the charge in keeping children safe on social media. Credit: 7NEWS

He claims the job offer was rescinded hours after he publicly criticised Instagram for failing to protect children online.

He made the criticism during a webinar on safeguarding against financial sextortion schemes, which featured the families of children who died after being scammed on Instagram.

Meta denies this claim, highlighting its team of approximately 40,000 people dedicated to working on safety and security matters — including staff with backgrounds in child safety, investigations and law enforcement.

Raffile has been outspoken about the need to protect children from predators online, particularly scammers targeting users through sextortion schemes.

These schemes were exposed in a previous 7NEWS Spotlight episode featuring the family of a young Australian boy who took his own life after being targeted by scammers.

These scammers, known as Yahoo Boys, target young social media users, most commonly teenage boys, befriending them before convincing them to send through sexually explicit photos.

Generally, they befriend their victims by pretending to be attractive young girls, sometimes using hacked accounts.

Once their victims have sent the explicit photos through, the scammers then demand money to prevent these images from being released.

Raffile noted that many victims of sextortion would not be protected by increasing the age limit for social media platforms.

“What are we going to do about all the 16 and 17-year-olds who are falling victim to sextortion?” he asked.

“I believe there (are) more suicides, as it relates to the scam among 16- and 17-year-olds, than younger teenagers.

“So are we just going to let them fend for themselves?”

Raffile questioned whether raising the age limit would relieve tech companies of the responsibility to make their platforms safe for all users.

Changes required

Raffile has been vocal about the changes required to make platforms safer, including immediately making the Friends and Follower lists of all underage users private.

“These lists are the primary source of leverage in nearly all the financial sextortion scams targeting minors,” Raffile said.

“The moment a teen accepts a scammer’s friend request, their entire social network is exposed.”

Accounts for teens under 16 are automatically set to private, though these lists are still available to their friends.

Scammers also frequently used the same images for their fake identities, and Meta already had the technology to detect this imagery and block the accounts behind it, Raffile said.

Meta has resisted calls to further police underage users on its platforms.

It has its own age-verification technology in place, using artificial intelligence to detect when people may be misrepresenting their age.

The organisation believes any further responsibility would be better placed on app stores and operating systems, its head of global safety Antigone Davis previously told the Social Media and Australian Society inquiry.

Last week, Davis also said she did not believe social media had harmed children.

“Issues of teen mental health are complex and multifactorial,” Davis said.

Meta offers tools to help parents manage teen social media use, including daily limits and scheduled breaks set by parents, and the ability to see their children’s Follow lists and Block lists.

The Instagram app reminds teens to leave the app, and alerts them when they might have been scrolling on the same topic for an extended period of time.

The app also provides links to relevant organisations when someone searches for, or posts content related to, suicide, self-harm, eating disorders or body image.

Other platforms have also pushed back against calls for further restrictions on underage users.

Snapchat public policy head Henry Turnbull told the inquiry he did not support using technology to restrict children under 16 from using social media.

Ordered to act

Australia’s eSafety Commissioner Julie Inman Grant has given the tech industry until October 3 to create enforceable rules to stop children from viewing graphic pornographic content on social media platforms.

“Around the age of 13 is the average age that an Australian teenager will see pornography,” Inman Grant told 7NEWS.

“Thirty per cent of the children that we spoke with came across this graphic content accidentally when searching for a video on YouTube or searching the internet.

“You know, searching for My Little Pony and getting a different kind of pony ride.

“It sounds funny but it isn’t when you’re talking about a child that is just not equipped to see that kind of content.

“All they need to do is open up a browser on a smartphone and that’s an entry way into a very graphic, extreme and violent pornographic experience. A lot more choking, lots more asphyxiation.

“And when you’re talking about young children, they don’t know how to comprehend what it is they’re seeing and understand what it is (that’s) happening.

“Obviously, this can have very harmful impacts on them as individuals but (also) collectively, as an entire generation, if we don’t get ahead of this.”

Australia’s eSafety Commissioner Julie Inman Grant has given the tech industry until October 3 to create enforceable rules to stop children from viewing graphic pornographic content on social media platforms. Credit: 7NEWS

Inman Grant has ordered social media companies, search engines, app stores and gaming sites to individually come up with industry codes that will prevent children from unintentionally seeing inappropriate material.

“It requires every single sector to do something to put some robust and reasonable protections in place so that there isn’t a single point of failure so that our children are protected at each level,” she said.

The watchdog wants safeguards including age checks, tools to filter or blur unwanted sexual content, and parental controls.

She says she will impose mandatory standards if the industry codes presented are not strong enough.

“We’ve got a lot of huge conglomerates that have multiple services and centralised IDs, like an Apple ID,” Inman Grant said.

“Those safety settings should follow that child through all of their experiences within that walled garden, and should be age appropriate throughout their journey on technology.

“These companies are the richest in the world, they have access to the best minds and the most advanced technologies — they can target our advertising with deadly precision.

“They can certainly put a few protections in place to prevent the incidental incidents of our children coming across (this content).

“There’s just too much at stake for them not to act.”
