
Thursday, June 28, 2018

Essentially what the seemingly benign “if you like that, you’ll like this” mechanism is doing is training young children – practically from birth – to click on the first thing that comes along, regardless of the source.

In November of last year, I read an article in the New York Times about disturbing videos targeted at children that were being distributed via YouTube. Parents reported that their children were encountering knock-off editions of their favourite cartoon characters in situations of violence and death: Peppa Pig drinking bleach, or Mickey Mouse being run over by a car. A brief Google of some of the terms mentioned in the article brought up not only many more accounts of inappropriate content, in Facebook posts, newsgroup threads, and other newspapers, but also disturbing accounts of their effects. Previously happy and well-adjusted children became frightened of the dark, prone to fits of crying, or displayed violent behaviour and talked about self-harm – all classic symptoms of abuse. But despite these reports, YouTube and its parent company, Google, had done little to address them. Moreover, there seemed to be little understanding of where these videos were coming from, how they were produced – or even why they existed in the first place...

...Take YouTube’s recommendation system for starters, which doesn’t differentiate between Disney movies and a grainy animation cooked up by a bot farm in China. Essentially what the seemingly benign “if you like that, you’ll like this” mechanism is doing is training young children – practically from birth – to click on the first thing that comes along, regardless of the source. This is the same mechanism that sees Facebook slide fake political ads and conspiracy theories into the feeds of millions of disaffected voters, and the outcome – ever more extreme content and divided viewpoints – is much the same. Add the sheer impossibility of working out where these videos come from (most are anonymous accounts with thousands of barely differentiated uploads) and the viewer is adrift in a sea of existential uncertainty, which starts to feel worryingly familiar in a world where opaque and unaccountable systems increasingly control critical aspects of our everyday lives...

https://www.theguardian.com/technology/2018/jun/17/peppa-pig-youtube-weird-algorithms-automated-content?CMP=Share_iOSApp_Other

1 comment:

  1. "Educated consumers, are our best customers."

