Children and teenagers, like the rest of us, are spending more time online than ever before. In fact, the offline-to-online shift necessitated by the COVID-19 pandemic may have affected young people more profoundly than anyone else, especially since they were the last to become eligible for vaccines. Classes have moved online, entertainment has moved online, and the bulk of socialization has moved online, accelerating pre-pandemic trends.
With more time online comes more potential safety issues. It is our view, as a digital marketing agency, that the responsibility for protecting kids’ data and privacy lies predominantly with the platforms on which that time is spent: Google, YouTube, TikTok, Instagram, and others.
Last week, Alphabet announced new policies to better protect underage users of Google and YouTube, including changes to its ad targeting tools. On YouTube, those changes include updating the default upload setting to the ‘most private option available’ for kids aged 13-17, according to a recent post from the company. On Google, the company will make SafeSearch, which filters explicit results, the default setting for kids under 18 with Family Link accounts.
On Google Ads, the company has pledged to expand safeguards ‘to prevent age-sensitive ad categories from being shown to teens’ and will ‘block ad targeting based on the age, gender, or interests of people under 18.’ These changes will roll out in the coming months.
As a digital marketing agency, GrowthEngine Media supports any initiative to better protect the privacy of young internet users. Over the past several years, conversations about data protection and online security have become increasingly urgent. With more children than ever spending their days in front of a screen, everyone from marketing platforms to advertising agencies must make their safety and security a top priority.