Now we know for sure that big tech peddles despair, we must protect ourselves
Now that the inquest into the awful death of Molly Russell in 2017 has delivered
its findings, we have a new reality to adjust to. The teenager died from an act
of self-harm, “while suffering depression and the negative effects of online
content”. Her father described how she had entered “the bleakest of worlds”:
online content on self-harm and suicide was delivered in waves by Instagram and
Pinterest, the selection left entirely to the algorithm. “Looks like you’ve previously shown
an interest in despair: try this infinitely replenishing stream of fresh
despair.”
Social media platforms deliberately target users with content, seeking attention
and therefore advertising revenue: we knew that. This content can be extremely
damaging: we knew that, too. But surely now that we’ve struggled, falteringly,
towards the conclusion that it can be deadly, there can be no more complacency.
These are corporations like any other, and it’s time to build on the consensus
that they cause harm by regulating, as we would if they were producing toxic
waste and pumping it into paddling pools.
People, parents especially, worry a lot about the digital age and its impact on
teenagers, and a lot of those worries are nonsense: are they addicted to Fifa?
Will Minecraft turn them into recluses or sever their connection with the
natural world? Does Fortnite stop them reading books (in fact, yes, but some
other time for that)? Sometimes you’ll get a useful correction from a specialist
in addiction or adolescence, but there is no coherent pushback from the tech
giants, because these anxieties create exactly the debate they need, one that is
amorphous and essentially Luddite in character: what if today’s kids are less
resilient than yesterday’s because they were raised in a world with different stimuli? If
the real threat to kids is modernity itself, it can never be addressed, it can
only be discussed.
Underneath all that noise is a persistent drumbeat, an agenda now well known,
pursued by methods that have been widely studied. Any platform that is free to
use exists to maximise its advertising revenue, which means chasing watchers and
watch-time. The algorithms suggesting content are not designed to prioritise
quality or relevance, but to take any given user’s existing interest and steer
them towards ever more extreme versions of it. For Molly Russell, the tragic
result was a bombardment of increasingly explicit explorations of misery, such
that the coroner, Andrew Walker, said: “It would not be safe to leave suicide
as a conclusion.” We cannot seal off a death
from despair as an individual act when there are global corporations
unrestrainedly marketing despair.
The problem goes far beyond young people: we can see the impact of algorithms in
nativist politics all over the world, and in that regard youth is not the
defining factor – indeed, the casual characterisation of youth as a state of
vulnerability is its own blind alley. Nevertheless, there are two elements that
make social media particularly influential on the young, and the behemoths of
the field particularly culpable in their failure to address the problem. As
Laura Bates notes in Men Who Hate Women, her detailed research into the “manosphere”,
social media’s reach among Gen Z is astronomical: 85% of US teens use YouTube,
72% use Instagram, 51% still use Facebook. People spend significantly more time
watching content that’s been recommended than stuff they’ve gone looking for: on
YouTube, 70% of everything watched has been suggested by the site.