
PERSPECTIVE

Viewpoints from the frontline of content.

Down the tube

By Estelle Lloyd 20-02-2018

There’s no escaping the recent press exposing the safety, or otherwise, of kids’ content on the internet.

Headlines including ‘YouTube accused of “violence” against young children over kids content’ and ‘Something is wrong on the internet’ might have had parents throwing their kids’ tablets on to the bonfire, and not without cause. But who wants to break that news to their kids? Not many, and herein lies an issue.

YouTube was never intended for kids; its user-generated, unregulated content makes it a highly unsuitable service for young children. To a certain extent this also applies to YouTube Kids, which curates its service based on algorithms. And the fact remains that algorithms can’t catch everything, especially not on platforms that are designed to profit from direct and indirect advertising.

There’s a reason why some magazines are on the top shelf, why UK broadcasters impose a 21.00 watershed and why we direct our children to playgrounds in parks: we need safe spaces designed specifically for children. However, we continue to let them loose on the internet.

But criticising YouTube for not doing enough to protect children online is an easy position to take. Perhaps what we should be asking ourselves is why parents are letting their children access services that were never designed for young audiences in the first place.

Facebook’s Messenger Kids app

Most parents let their kids access YouTube because it offers a seemingly unlimited library of content and is, technically, free. Parents accept their children will watch videos as well as ads, neither of which will necessarily be appropriate, and that their viewing habits will be monitored and collected for profiling.

YouTube, like social media firms such as Facebook, owns a hell of a lot of data about its audience: age, birthday, location and even the information users share in private messages. These tech firms own all of this data and are free to use it however they want.

Facebook recently announced its Messenger Kids app for children under 13. Under the guise of tackling the issue of online safety on social media, it has developed a service that enables it to start building a profile of a child as early as it possibly can.

Let me be clear: this has nothing to do with Facebook’s concerns about protecting children from ‘fake news’ or restricting access to strangers. The question is, why aren’t more parents pointing their children in the direction of safe messaging services available for kids, built around a clear set of ethical rules to prepare them for their digital future, before they graduate to more grown-up services?

This could be about to change as parents gain a greater understanding of the risks the internet poses to children: trolling, clickbait and the negative impact of a highly commercialised environment. Many are realising their kids are not quite the ‘digital natives’ they have been hailed to be. After all, being able to use a device by imitating others’ behaviour does not mean a child has a critical understanding of the technology they are using.

Of course, YouTube should do more to make its services safer for children. But no amount of content policing and self-regulation will ever guarantee that YouTube and YouTube Kids are completely safe. As a result, the regulatory landscape around these increasingly profitable companies needs to change.

If UK media regulator Ofcom can regulate and fine local broadcasters that schedule inappropriate content, why should the digital titans, which are far richer and more influential than those broadcasters, escape any form of policing? It is clear the kids are not OK in the unregulated digital environment as it is currently designed.

The upside is there’s an increasingly diverse choice of safe, ad-free digital entertainment services designed for kids on offer to consumers. Coupled with parents who are increasingly aware of the negative effects of unfiltered video platforms and social media, the tide could quickly turn against businesses that profit from young viewers but do little to protect them.

Ultimately, the consumer is king: parents are free to make good and valid choices for their kids. Sometimes, in this fast-moving digital age, it’s easy to forget that.

TODAY'S CORRESPONDENT

Estelle Lloyd, Co-Founder, Azoomee

Born in France, Estelle Lloyd spent most of her career in New York and London. After early work in investment banking, she founded VB/Research in 2006. In 2014, she co-founded Azoomee, the app where kids can access hundreds of hours of content – TV, games, tutorial videos, audiobooks – in one secure place.

Azoomee offers tailored, age-appropriate content that changes and develops as the child grows. The UK’s largest kids’ charity, the NSPCC, is the company’s founding partner and advisor on safety and parental controls.
