Young people from Wales are calling on social media platforms to make it easier to report harmful content.
Students at Pontypridd High School in Rhondda Cynon Taf put questions to bosses at TikTok and Meta.
The platforms said they listen to users’ feedback and encourage them to use safety features.
Alex Davies-Jones MP organized the event as the UK’s Online Safety Bill passes through Parliament.
Caitlin, 17, said: “There’s barely a day that goes by where something negative doesn’t happen to me.
“I think social media is pretty dangerous, for me and my friends.
“Scrolling through TikTok yesterday, there were loads of challenges aimed at young girls to eat only around 300-400 calories a day …. It could also trigger eating disorders.”
Caitlin said she has tried many times to report harmful content, without success.
“I believe the reporting process needs to be improved. It takes so long to get things taken down, and by the time they are, the damage has already been done.”
TikTok said it removes all content that encourages disordered eating.
Caitlin was subjected to online abuse after posting about Manchester United footballer Mason Greenwood being accused of rape, allegations he denied.
“I talked about the Mason Greenwood situation, and was told that I should get beaten up about it.”
“I feel like social media has made it harder for women to be heard, and I feel like we are being pushed back a little.”
“It’s almost like we’re fighting a war. It’s almost like we can’t speak without being threatened with harm.”
Others highlighted the positive aspects of social media, such as keeping in touch with friends and campaigns promoting self-love.
Isabelle, 17, said there is much more body positivity now than there ever used to be.
Students working on an online safety project were given the chance to question representatives of the tech giants during a virtual event.
“Young people find it difficult to report or remove disturbing content. What can you do to make this easier?” asked Brooke, 13.
Alexandra Evans, TikTok’s European head of safety policy, said she believes the platform’s reporting mechanisms are intuitive, but that it would welcome feedback from users about how it is doing.
She also highlighted the blocking functions: “For instance, if you don’t like the word ‘hate’ or ‘loser’, whatever it may be, you can create a list that will filter your comments.”
Megan Thomas, a public policy associate manager at Meta, which owns Instagram and Facebook, said the company had recently created new features to “help people avoid experiencing any type of harmful content on our platforms.”
Poppi, 13, wanted to know how many offensive comments are removed each day and what penalties are in place for repeat offenders.
Both representatives said they did not know the daily figure, but pointed to their quarterly reports.
Ms Evans of TikTok said there is a range of harm and a spectrum of behavior, and that the platform tries to be specific in its responses.
“But when it comes down to the most egregious cases, those absolutely zero-tolerance behaviors, we all work together to ensure that we respond and stamp out this kind of activity across all our platforms.”
The UK government is introducing new online safety laws, but Alex Davies-Jones, the Labour MP for Pontypridd and shadow technology minister, warned of “loopholes” in the legislation.
A UK government spokesperson said its “pioneering Online Safety Bill” would bring major improvements to the safety of women and girls, and would criminalize cyber-flashing.
They added that social media companies could be held responsible for failing to take action and could face heavy fines.