Just seen someone be killed on YouTube shorts.

Posts: 5434
Free Member
Topic starter
 
[#13104230]

I’ve just been shown a video on YouTube shorts that looks like someone being killed by a propellor that has come loose.

I’m aware that it may be fake, but it didn’t look like it to me, and anyway that’s not particularly the point.

Unsurprisingly it’s not something that I’ve shown an interest in on YouTube previously. Most of my searches are around mountain biking and bikepacking videos.

Frankly, I’ve seen enough people dying in real life when I was a junior doctor on the trauma team in a large city hospital.

I’m just staggered that it can serve up something so graphic (and potentially traumatising) without any warning whatsoever.

So much for “don’t be evil”.


 
Posted : 06/01/2024 9:30 am
jamj1974, gowerboy, bol and 5 people reacted
Posts: 2882
Free Member
 

That’s pretty rough. Did you manage to flag / report the video?

I can’t fathom folk who post / want to consume that sort of content at all.


 
Posted : 06/01/2024 9:40 am
Posts: 35221
Full Member
 

Bloody hell, that's a bit nightmarish. Might be worth flagging if you think it's real.


 
Posted : 06/01/2024 9:44 am
Posts: 18615
Free Member
 

Russian driving vids excepted


 
Posted : 06/01/2024 9:45 am
ayjaydoubleyou, footflaps and 1 person reacted
Posts: 5434
Free Member
Topic starter
 

I’ve reported it.

The only thing I can think of was that yesterday I watched a video of chimpanzees hunting baboons that was pretty gruesome in an interesting way. Not something that I would want a child or someone of a nervous disposition to see, but comes under “nature, red in tooth and claw” sort of category.

It’s just so irresponsible of the platforms not to do more to stop this.


 
Posted : 06/01/2024 9:50 am
Posts: 1130
Free Member
 

It’s just so irresponsible of the platforms not to do more to stop this.

You’re not wrong, but [i]how[/i] is the problem. There is something like 500 hours of content uploaded to YouTube [b]every minute[/b]. It’s certainly not possible to watch it all. Machine learning will get to a point where it can cover most things, but after that they can only rely on people reporting material. And even then, the volumes reported are near-on impossible to process manually.


 
Posted : 06/01/2024 10:00 am
ayjaydoubleyou, fasthaggis, dyna-ti and 5 people reacted
Posts: 4180
Free Member
 

Multiply that by most of Gen Z and you can start to see where we are getting contributory factors to the current youth mental health crisis.

The big platforms should be banned. End of.


 
Posted : 06/01/2024 10:00 am
Posts: 78655
Full Member
 

The big platforms should be banned. End of.

Because history has proven time and again just how effective prohibition is.

YouTube may not be ideally regulated, but regulated it is. It is by any measure 'safer' than anything you might trip over on the dark web.


 
Posted : 06/01/2024 10:10 am
dc1988, ayjaydoubleyou, acidchunks and 23 people reacted
 poly
Posts: 9167
Free Member
 

The big platforms should be banned. End of.

Because multiple small platforms would be better? I think then you drive the niche weird crap into apps ordinary people have never heard of, so they never see or report that content.

Showing the graphic death of someone, even in an accident or a fake video, will be against YouTube rules and typically see a video pulled, potentially the account frozen, maybe even the user banned.

Managing this is a problem governments seem to struggle with, and knee-jerk reactions like banning it are the simplistic view of politicians who don’t understand technology and feel the need to appeal to the public with “solutions”. In reality, if the person who posted it is in the U.K. they will likely have committed an offence by posting a grossly offensive message via a communications network.


 
Posted : 06/01/2024 10:10 am
Posts: 477
Free Member
 

Maybe they could just not allow any content to be shown unless it has been robustly checked.

But they choose not to do that as it’s too hard / they want to make money.


 
Posted : 06/01/2024 10:25 am
kelvin reacted
Posts: 5830
Full Member
 

It's not that they "choose not to", it's that it's simply not possible! See @bensales' staggering statistic above:

"500 hours of content uploaded to YouTube every minute". 

I CBA with the maths but how many tens of thousands of employees would you need to pay to review all that?


 
Posted : 06/01/2024 10:33 am
fasthaggis, footflaps and 1 person reacted
Posts: 23635
Full Member
 

I suppose the thing to remember is - YouTube doesn't create any of the content - someone decided to upload it, it wasn't YouTube's idea. It's a platform for other people's content. So someone created and posted that video, and there's no process by which YouTube scrutinises content before it's uploaded or before anyone views it. As with all social media, that process of scrutiny is basically outsourced to the rest of us and only starts once the content is already public - there's no mechanism for the platform to act until the content has been seen by the public and reported back to them. So their content moderation is basically one of shutting the gate after the horse has bolted.

There seemingly has been a shift in YouTube's recommendation system recently though - up until a few months ago, the way content was offered up seemed to tend towards fuelling YouTube 'stars': what you were shown was based on a mix of factors that considered things you'd seen and searched for before, plus the most viewed videos on the platform. And you can see why there is a presumption that you would want to see something that everyone else is watching. You could argue that it steered viewers towards a handful of very successful channels and made it difficult for any new venture to get started, though. And the sort-of point of YouTube is that it's somewhere to see and show anything and everything, not just the voices and faces of a few.

That seems to have flipped - I curiously get offered videos now that have had dozens of views in the decade since they were uploaded, and I've seen items by YouTubers saying the metrics of their channel / content are now very odd - with their content clearly being promoted to wider demographics but getting very low engagement as a result. It's not really clear what the point of this shift is, but the result is that weird random crap finds itself in front of more people.

I’ve reported it.

What's quite grim is what happens next when you do that - there was an excellent Storyville documentary (not currently available on iPlayer, unfortunately) about the outsourced teams that do the content moderation for the big social media companies - people whose job it is to view successive images and films of porn, abuse, violence and death when we click 'report' - a guy who's seen so many ISIS beheading videos that he can view a picture of a corpse and know how sharp the knife was.


 
Posted : 06/01/2024 10:34 am
footflaps reacted
Posts: 477
Free Member
 

It’s not that they “choose not to”, it’s that it’s simply not possible!

It’s not possible if they have the policy that anybody can upload / uploads are instant / whatever their current policy is. But if they changed their policy so that nothing could find its way online until it had been robustly checked, then such damaging content wouldn’t find its way onto YouTube.

But if they took this approach, they wouldn’t be able to continue making the same amount of money they currently do. So damaging content getting online is basically treated as collateral damage while they continue to make money.


 
Posted : 06/01/2024 10:40 am
onewheelgood, kelvin, mogrim and 3 people reacted
Posts: 23635
Full Member
 

It’s not possible if they have the policy that anybody can upload / uploads are instant / whatever their current policy is. But if they changed their policy so that nothing could find its way online until it had been robustly checked, then such damaging content wouldn’t find its way onto YouTube.

You typed that paragraph, clicked submit, and it appeared instantly on this moderated forum. Should every sentence of the conversation on this forum be viewed and vetted before being published?


 
Posted : 06/01/2024 10:43 am
blokeuptheroad, theotherjonv, footflaps and 3 people reacted
Posts: 477
Free Member
 

If the owners of STW recognised that there was a problem with damaging content being able to be instantly uploaded to their platform, then they would need to make a decision if they should continue with their platform in its current state. I don’t believe that is an issue with STW, but clearly it is with YouTube, Facebook etc


 
Posted : 06/01/2024 10:49 am
kelvin reacted
Posts: 7563
Free Member
 

*deleted by moderator*


 
Posted : 06/01/2024 10:51 am
Posts: 7082
Full Member
 

At the dawn of video there were rumours of snuff movies.


 
Posted : 06/01/2024 10:56 am
Posts: 23635
Full Member
 

I don’t believe that is an issue with STW, but clearly it is with YouTube, Facebook etc

I had to raise an issue with Mark many years ago when one of STW's ad servers provided me with a lovely image of a guy who'd had the lower half of his face torn off in a motorcycle accident


 
Posted : 06/01/2024 10:58 am
Posts: 5434
Free Member
Topic starter
 

You typed that paragraph, clicked submit, and it appeared instantly on this moderated forum. Should every sentence of the conversation on this forum be viewed and vetted before being published?

This is a false comparison, as nobody is being shown stuff on this forum by an algorithm, we're all choosing it.

There's an argument to be made that if your (YouTube, Facebook etc) business model relies on choosing what content to show people in order to make a profit, then you should be responsible for making sure that content isn't harmful.

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

The US will never regulate it, after all they're mostly US companies, but that doesn't mean the rest of the world shouldn't.


 
Posted : 06/01/2024 11:04 am
kelvin reacted
Posts: 4180
Free Member
 

I know my response was simplistic.

I know it's not going to happen - just like with nuclear weapons, the genie is out of the bottle.

I guess this is just a bit of a raw subject for me, as I have a neurodiverse child addicted to doomscrolling and I'm watching it reduce them in every way, and it's tearing me apart. To think that people are making huge sums of money out of this makes me angry.


 
Posted : 06/01/2024 11:05 am
Posts: 424
Free Member
 

Don't use Instagram Reels if you don't like seeing death; every 7-10 videos on there for me is someone getting killed, and the content's not always flagged with the "sensitive content click see reel to watch anyway" marker.


 
Posted : 06/01/2024 11:06 am
Posts: 13594
Free Member
 

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

They've always used algorithms to curate what they show you, as they can't possibly show you everything...


 
Posted : 06/01/2024 11:11 am
 5lab
Posts: 7926
Free Member
 

500 hours per minute means 30,000 people needed round the clock, or around 140,000 full time employees, if you only watched each video once. Assuming a need for training, a need for some videos to be viewed multiple times etc, it's probably around 200,000 employees needed to moderate all the content. On top of that you'd need a management structure, maybe another 20,000 people, plus IT support, cleaners, office space, etc. It's just not feasible.
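For anyone who wants to check the arithmetic, here's a rough sketch (assuming one reviewer per stream and a 40-hour working week, with no allowance yet for training, holidays or repeat viewings - which is what pushes the estimate towards the 140,000-200,000 range quoted above):

```python
# Rough check of the moderation staffing estimate.
UPLOAD_HOURS_PER_MINUTE = 500

# 500 hours of video arrive every minute, and one reviewer can watch
# one minute of video per minute, so 500 * 60 reviewers must be
# watching at any instant, around the clock.
concurrent_reviewers = UPLOAD_HOURS_PER_MINUTE * 60    # 30,000

# A week has 168 hours but a full-time employee works about 40 of them,
# so each round-the-clock seat needs 168 / 40 = 4.2 employees.
full_time_employees = concurrent_reviewers * 168 / 40  # 126,000

print(concurrent_reviewers)      # 30000
print(int(full_time_employees))  # 126000
```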


 
Posted : 06/01/2024 11:35 am
Posts: 9656
Full Member
 

Must say Instagram is bad if you click search and random stuff comes up.


 
Posted : 06/01/2024 11:45 am
Posts: 1031
Free Member
 

Don’t use Instagram Reels if you don’t like seeing death, every 7-10 videos on there for me is someone getting killed

If this is true, and you can’t stop it despite the reports… why the F are you still on Instagram? Your support (by viewing the non-death stuff) is ultimately enabling this. The mind boggles. Sack it off - you’ll be thankful for the time back if nothing else.


 
Posted : 06/01/2024 11:51 am
Posts: 23635
Full Member
 

Against 2.7 billion active monthly users for YouTube globally, 200,000 is a pretty small number. It's not that there aren't moderators - there are tens of thousands of them, albeit at arm's length so they don't really show up on Google's / Meta's or whoever's roster. What would change the nature of that work for the people who have to do it is that, for the large part, knowing the content is going to be checked would give posters pause for thought. Putting something horrible up so that it's there for as long as you can get away with is different to uploading something that you know won't get past moderation - moderation would require far less intervention.


 
Posted : 06/01/2024 11:54 am
Posts: 1230
Full Member
 

500 hours per minute means 30,000 people needed round the clock, or around 140,000 full time employees, if you only watched each video once. Assuming a need for training, a need for some videos to be viewed multiple times etc, it’s probably around 200,000 employees needed to moderate all the content. On top of that you’d need a management structure, maybe another 20,000 people, plus IT support, cleaners, office space, etc. It’s just not feasible.

It *is* feasible. For certain it's a lot of people, but there's nothing about it that would make it unfeasible. 250k employees isn't even all that big in the grand scheme of things.

The question really is whether it's worth it, not whether it's practically possible.


 
Posted : 06/01/2024 12:05 pm
 poly
Posts: 9167
Free Member
 

This is a false comparison, as nobody is being shown stuff on this forum by an algorithm, we’re all choosing it.

It’s not a completely false comparison - I could start a thread right now with an innocuous and intriguing title and include in it any sort of evil I wanted, until it was reported / removed. The more people engage with it to write “that’s ridiculous, reported”, the more it stays at the top of the page until a mod gets to it. It’s not sophisticated, but it’s the same idea. On some other forums, if you are reading a thread it recommends other threads that look similar - again, an algorithm.

Your point about “validation” of the algorithm is interesting though. If I was asked to design a “safe” algorithm I’d have weightings for the users who post, the number of times a video has been watched, reports received etc., and that would factor into how videos were propagated to new users. BUT if you got some totally random irrelevant content, it has probably already been seen by lots of people who haven’t reported it - the judgement is not about YouTube but about other users.

You aren’t required to use YouTube (or to watch stuff it suggests - turn autoplay off).

You definitely aren’t required to watch Shorts.

I watch a lot of YouTube but very few Shorts. So I just did a test - of the first twenty videos it shows: 13 were from channels I either subscribe to or watch fairly often; 4 were from closely related channels; 2 seemed to be adverts; 1 was a bit “random” but was just a bit of bizarre weirdness - it was certainly in no way offensive, and I’d probably have watched it all the way through if I wasn’t scrolling to the next one to write this summary. Obviously it’s not your fault if you are getting served content you don’t want - but like the people who tell me Facebook is full of people having political arguments - it’s not, Facebook has decided they want political arguments, and Google has decided you want to see nasty shit (whereas my Facebook is full of family, club news etc. and my YouTube is Taskmaster outtakes, Would I Lie To You clips, educational science content etc.).


 
Posted : 06/01/2024 12:17 pm
Posts: 1130
Free Member
 

Don’t use Instagram Reels if you don’t like seeing death; every 7-10 videos on there for me is someone getting killed, and the content’s not always flagged with the “sensitive content click see reel to watch anyway” marker.

Whereas I get a few dashcam videos, a whole bunch of weight lifting stuff, some motorbike stunt riders, and a shitload of cat videos thanks to my daughter.

I don’t think I’ve ever seen a sensitive content warning on there. So if it’s recommending you such stuff, it’s doing it because you’ve watched them in the past.


 
Posted : 06/01/2024 12:24 pm
Posts: 21027
 

The question really is whether it’s worth it,

No.

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.
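The wage-bill figure is easy to reproduce - a minimal sketch, assuming a ballpark UK minimum-wage full-time salary of about £20,000 a year (the exact figure depends on hours worked and the rate in force):

```python
# Ballpark wage bill for a hypothetical 250,000-person moderation team.
HEADCOUNT = 250_000
ANNUAL_SALARY_GBP = 20_000  # assumed ~minimum-wage full-time salary

wage_bill = HEADCOUNT * ANNUAL_SALARY_GBP
print(f"£{wage_bill / 1e9:.0f}bn")  # £5bn
```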


 
Posted : 06/01/2024 12:26 pm
Posts: 7872
Full Member
 

Don’t use Instagram Reels if you don’t like seeing death, every 7-10 videos on there for me is someone getting killed

Aren't those things only 30s long? So you are seeing the death of a person every 3.5 to 5 minutes?

I got rid of it because it was showing vids of girls showing their underwear, which is surely less damaging than people being killed.


 
Posted : 06/01/2024 12:29 pm
Posts: 1130
Free Member
 

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.

Of course they could. My employer has somewhere between 25 and 30 billion euro revenue a year, about 2 billion profit, and has nearly 300k employees who get paid a hell of a lot more than minimum wage.


 
Posted : 06/01/2024 12:31 pm
Posts: 5434
Free Member
Topic starter
 

I got rid of it because it was showing vids of girls showing their underwear

You got rid of it because of that? 😉


 
Posted : 06/01/2024 12:31 pm
Posts: 21027
 

Of course they could. My employer has somewhere between 25 and 30 billion euro revenue a year, about 2 billion profit, and has nearly 300k employees who get paid a hell of a lot more than minimum wage.

So could you afford an additional 250k employees, which is what’s being asked here?


 
Posted : 06/01/2024 12:35 pm
Posts: 7872
Full Member
 

You got rid of it because of that?

On my opticians advice.


 
Posted : 06/01/2024 12:37 pm
oceanskipper, stingmered and 1 person reacted
Posts: 11605
Free Member
 

Nobody ever saw Rotten back in the day?

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

Nope - unless it's P2P, the same rules apply. None of those things have ever been allowed on the platform, but it happens. People post stuff on here all the time that breaks the rules and it gets removed by the same process.

Don’t use Instagram Reels if you don’t like seeing death; every 7-10 videos on there for me is someone getting killed, and the content’s not always flagged with the “sensitive content click see reel to watch anyway” marker.

Literally never seen this.


 
Posted : 06/01/2024 12:39 pm
Posts: 23635
Full Member
 

The question really is whether it’s worth it,

No.

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.

Like a lot of the Internet, YouTube runs on a free-to-use basis funded by advertising (which frankly is often alarmingly badly moderated too), with an option to pay for an ad-free experience. The consequences of poor moderation can be felt by all users (free or paid), but they're not caused by all users, only by the ones who post content. It's the content creators / uploaders / re-uploaders that create the burden. 500 hours per minute of uploaded content, and most of that content will only be a few minutes long.

What if an upload cost a quid? Would people post content they know will be taken down soon if it cost them a bit of money? Would people post illegal content if it involved a traceable transaction rather than a burner email account? The burden of moderation could be both reduced and self-funding.


 
Posted : 06/01/2024 12:39 pm
Posts: 5434
Free Member
Topic starter
 

@bensales

So if it’s recommending you such stuff, it’s doing it because you’ve watching them in the past.

Not necessarily. If the algorithm detects that you're a passive consumer of content, it'll take you quite quickly into extreme stuff. It's a known feature of these algorithms.

As mentioned in my original post, I've seen more than enough death IRL that I'm not in any way curious about it. I don't think it should be shown as entertainment, because it's disrespectful. Most of my YouTube Shorts (which cross over with TikTok and Instagram Reels, I believe) have been mountain biking, patisserie making and barbecuing.

As mentioned, the only thing I can think of is the video that I saw yesterday of the chimpanzees and the baboon, which I let repeat a few times because I was checking what I'd just seen.


 
Posted : 06/01/2024 12:40 pm
Posts: 24
Full Member
 

There are platforms which don't use algorithms, instead you choose what you see. It doesn't solve the need for moderation but it does mean you only get stuff you search for or from providers you trust.
I don't know if governments will ever get to grips with it, but is there mileage in regulating what the algorithms are designed for?


 
Posted : 06/01/2024 12:41 pm
Posts: 1130
Free Member
 

so could you afford an additional 250k employees, which is what’s being asked here?

Google (the evil empire behind YouTube) employ about 160k people.

Their revenue is currently around 280 billion dollars.

Their profit is something like 60 billion dollars.

They can afford a few more staff if they want to. Larry, Sergey and Sundar might have to forego new yachts.


 
Posted : 06/01/2024 12:47 pm
Posts: 23635
Full Member
 

As mentioned, the only thing I can think of is the video that I saw yesterday of the chimpanzees and the baboon, which I let repeat a few times because I was checking what I’d just seen.

It could just as easily be a manipulation of tags and other identifying criteria by the uploader. If someone thought it was funny to shock the unsuspecting, they could upload videos of death and mutilation and tag them as patisserie and home baking.


 
Posted : 06/01/2024 12:48 pm
Posts: 4180
Free Member
 

"I don’t think I’ve ever seen a sensitive content warning on there. So if it’s recommending you such stuff, it’s doing it because you’ve watched them in the past."

That is absolutely not true.

The hook systems used are wide, varied and amazingly good at radicalizing the viewer, for want of a better phrase - i.e. drawing them away from the shallows and into deep water bit by bit, vid by vid. The more you scroll, the more they learn. Just hovering for a split second longer on something will be enough of a trigger.

I'm not sure everybody here, especially those without children, realises just how much time children and young adults can spend on these insidious sites. Obviously the platforms just want you to watch content - they don't care what it is. So if the algorithms decide a particular account gets more screen time with cat videos, it's unlikely that viewer will end up with violent content. But the moment the screen time drops they will try something else, and almost always the end result is more extreme. The human brain is designed to constantly recalibrate a baseline - it's the only way we can cope - but this means we are very good at desensitizing ourselves in the short term... at the huge expense of trauma in the long term.


 
Posted : 06/01/2024 1:02 pm
 poly
Posts: 9167
Free Member
 

They can afford a few more staff if they want to. Larry, Sergey and Sundar might have to forego new yachts.

Lots of people who use this forum would be adversely affected too - Alphabet will be a significant part of many people’s pension portfolios. Easy to point the finger at “big corporate”, but like it or not, it’s not quite as simple as telling them to behave better.


 
Posted : 06/01/2024 2:21 pm
Posts: 4519
Full Member
 

This is just one of the many unpleasant consequences of our expectation that stuff on the internet should be free. Every previous content delivery system was either paid for - like books, magazines, movies, LPs - or funded by ads that were only very crudely targeted: commercial TV and radio, free newspapers. I've watched the evolution of the internet since the beginning and watched it become more and more corrupted by the greed of the big players. What started out as the democratisation of access to information and communication has become what we see today. It's tragic, but the genie isn't going to get back in the bottle and we will just have to figure out ways to live with it.


 
Posted : 06/01/2024 2:27 pm
 poly
Posts: 9167
Free Member
 

Not necessarily. If the algorithm detects that you’re a passive consumer of content, it’ll take you quite quickly into extreme stuff. It’s a known feature of these algorithms.

I’m not sure what a passive consumer of content is. By the sounds of it, yesterday you were slightly less passive, by repeatedly rewatching a video that many would find a bit gruesome. My understanding is the algorithm now thinks* that you will probably like other videos that other people who watched/liked/shared/rewatched that video also liked. If you’ve fallen into a pool with the young-lad types who share gruesome content, that would explain it. Certainly with main YouTube you can tell it you don’t want to see a particular video or channel, and that has an impact on the algorithm in the other direction (e.g. if you say “don’t show me this” on a patisserie channel, you’ll start seeing less fine-cooking content).

* the algorithm of course doesn’t think at all - we anthropomorphise it because it’s easier than accepting you’ve been manipulated to watch something by a set of mathematical calculations with absolutely no actual insight into who you are or indeed what the videos are about.


 
Posted : 06/01/2024 2:34 pm
Page 1 / 2