Social media giants push self-harm content, adult material and crash diets to children as young as 13

Children as young as 13 are targeted online with adult sexual images and content promoting suicide and crash diets within 24 hours of creating social media accounts, an investigation has found.

The study, to be released on Tuesday, found that commercially driven algorithms at social media companies were directing ads at children while increasingly recommending adult and harmful content to them in an attempt to monetize their accounts.

This included sexual content from pornographic sites, contact requests from adults, material on self-harm and suicide, and idealized bodies “so unrealistic they distort any idea of what a body should look like”.

The findings come from researchers at Revealing Reality, working for the 5Rights Foundation, who created 10 fake accounts, or “avatars”, mimicking children aged between 13 and 17 setting up social media accounts.

Within 24 hours, the accounts were directly targeted with harmful content on platforms such as Facebook, Instagram and TikTok, and received direct messages or were added to group chats by strangers linking them to sites with paid-for or pornographic content.

“Deep neglect and disregard for children”

Dame Rachel de Souza, the Children’s Commissioner, will call on Tuesday for urgent action by government and industry to “create an online world fit for children”.

“This research highlights the huge range of risks children face online. We do not allow children to access services and content inappropriate for them, such as pornography, in the offline world. They shouldn’t be able to access it in the online world either,” she said.

Ian Russell, whose daughter Molly took her own life after being bombarded with self-harm content on Instagram, said the research showed that “algorithmic amplification actively connects children to harmful digital content, as unfortunately I know all too well, sometimes with tragic consequences”.

Baroness Kidron, chair of the 5Rights Foundation, said: “What this Pathways research highlights is a deep neglect and disregard for children, embedded in the features, products and services of the digital world.”

Content designed to increase screen time

Designers told the researchers they were tasked with creating services that maximize time spent online, maximize reach by attracting as many users as possible, and maximize user activity by encouraging people to interact with and generate as much content as possible.

This meant children’s attention was held with push notifications, endlessly scrolling feeds, likes to quantify their popularity, shared content, in-app or in-game purchases, and easy links to connect with friends or followers.

The child avatars were not only targeted with age-appropriate ads but were also served sexual content, contact requests from adults, self-harm and suicide material, crash diets and other extreme body-image content.

In one screenshot, a child receives ads for a Nintendo Switch, sweets and teenage stamps, alongside pro-suicide material titled “It’s Easy To End It”.

Another avatar, registered as a 15-year-old, is targeted with a Home Office advert for a campaign against child abuse while at the same time being offered contact from adults and content featuring pornographic poses.

“The companies monetized these child accounts, yet still recommended, ranked or offered material that in many cases violated their own terms, and in any case should never have been offered to a user registered as a child,” said Baroness Kidron.
