Entertainment Companies Use Social Media to Reach Minors with Mature-Rated Content, New Study Finds

Lauren Shank | November 17, 2022 | 10:26am EST

(CNS News) – New research by the Parents Television and Media Council (PTC) documents how easily 13- to 17-year-olds can access dangerous, explicit and adult-themed content on social media platforms, fueled by Hollywood marketing strategies.

The frequent use of social media, combined with algorithms that cater feeds to users’ interests, makes media platforms addictive and “dangerous – especially for young people,” a PTC press release warns.

According to a Pew Research Center survey on teens, social media and technology, 97% of teens under 18 said they use the internet daily, and 46% said they use it almost constantly. On TikTok specifically, 67% of teens actively use the app.

“Companies, including entertainment companies, exploit teens’ fear of missing out and social anxiety to increase engagement around their product through Social Media Influence Marketing,” the PTC press release stated.

PTC studied first-hand how media companies market their mature-rated content on the platforms most popular with teens. Analysts created both Instagram and TikTok accounts for a “13-year-old.” Instagram did not require age verification, and while TikTok offered a parental-access option, it could be bypassed without linking an adult to the account.

Through both accounts, the analysts searched for hashtags tied to shows rated for mature audiences, specifically ones featuring child-aged characters engaging in “violent or sexually explicit scenes.” The “13-year-old” account was able to access and engage with content from 13 Reasons Why, A Teacher, Big Mouth, Euphoria, Panic, PEN15, Sex Education and Squid Game without any barriers:

“Even though the analyst was logged-on as a 13-year-old, the ‘#bigmouthnetflix’ led to troubling material, including a video of a young girl hugging people in ‘hormone monster’ costumes and being given a penis-shaped lollypop with a ‘Big Mouth’ sticker on it, and user uploaded content from the show featuring an adolescent character asking his girlfriend, ‘Are you saying I’m bad at fingering?’”

Meanwhile, Panic on Amazon Prime Video, a series about teenagers who participate in life-threatening dares to “escape their small town,” is easily accessible to an impressionable young audience and opens the door for “copy-cat behavior,” PTC found. The issue lies in its marketing on a platform “that is being sued for wrongful death after two girls died allegedly trying to participate in one TikTok fad: The choking game.” The lawsuit alleges that an 8-year-old girl and a 9-year-old girl were fed “Blackout Challenge” videos through the algorithm, which encouraged users to strangle themselves.

“Media companies defend their production and distribution of adult-themed entertainment by pointing to their age-based content ratings, ostensibly shielding themselves from condemnation by insisting parents are responsible if children consume explicit programming,” the PTC press release states:

“Yet it is unmistakably clear from the findings of this report that those very same media companies are actively and intentionally usurping parental authority by marketing their age-inappropriate mature-rated content directly to children – especially to preteens and teens – on the social media platforms most used by children.”

PTC notes that the increase in depression, suicide and self-harm among youth tracks the increase in children’s social media screen time. For children 10 to 14 years of age, suicide is the second-leading cause of death. The National Institutes of Health found that the television show 13 Reasons Why was associated with a 28.9% increase in suicides among 10- to 17-year-olds in April of 2017.

A study conducted by Thorn, an international anti-human trafficking organization, revealed that, among kids who have shared nude photos or videos, 50% reported they had shared nudes with someone they had never met, and 41% reported they sent them to someone older than 18. The study also found that an increasing number of 9- to 12-year-old children have seen non-consensually re-shared nudes, and that those children were more likely to think nude-sharing is common among kids their age.

“Perhaps this is so because of media messages telling them this is normal – messages from programs like ‘Big Mouth,’ ‘Euphoria,’ ‘Sex Education,’ ‘A Teacher,’ and other mature-rated titles focused on teenaged characters,” the PTC press release suggests.

In an effort to filter out explicit content on children’s devices, PTC is calling on parents to become more aware of, and active in, the media-consumption habits of their young ones. The organization has also called on Hollywood and Big Tech to stop the intentional marketing of explicit content to children through social media, and for the Federal Trade Commission to reopen investigations “into the marketing of adult-rated content to children, focusing on the entertainment industry’s use of social media platforms as a marketing tool.”

PTC is also urging Congress to pass the Kids Online Safety Act (KOSA), introduced by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), which would require entertainment companies to establish a “duty of care” for children under 16, giving them tools to protect their data, shut off “addictive” features and opt out of algorithm-based recommendations. Certain settings would be enabled by default for children and their parents, who would have controls to help spot dangerous behavior.
