Rebuilding Trust in the Digital Playground: When the Internet Doesn’t Know Your Child Is a Child
- INQ Consulting

- Sep 26

by Kaitlyn Hebert, Director, Privacy
There’s a quiet kind of trust we place in the digital world. We hand our children a device, watch them laugh at dog videos, lip-sync to pop songs, maybe even post their own clips—and we hope the platforms they use will treat them with care.
But a recent investigation by Canada’s privacy regulators into TikTok’s data practices has revealed just how fragile that trust can be.
Children—some far too young to be on the platform—were slipping past weak age gates. Once inside, their personal information was collected, analyzed, and used in ways they couldn’t possibly understand. Their faces scanned. Their voices analyzed. Their behavior tracked. All without meaningful consent.
It’s a sobering reminder: the internet doesn’t always know your child is a child.
What the Investigation Revealed
1. The Illusion of Age Verification
TikTok says kids under 13 (or 14 in Quebec) aren’t allowed on the platform. But the only barrier is a birthdate field—no real checks, no safeguards. Just a digital door that’s easy to walk through if you know what number to type.
What this means: Children are entering spaces designed for older users, and their data is being collected as if they were adults.
2. Data That Reveals Too Much
TikTok wasn’t just collecting basic info. It was gathering behavioral data that could hint at a child’s health, beliefs, or identity. All of this was used to personalize content and ads—without a clear reason, and without meaningful limits.
What this means: Kids are being profiled in ways that could shape how they see themselves, because the profile determines the online world presented to them, and it is built on data they didn't know they were giving away.
3. Consent That Doesn’t Count
TikTok’s privacy policies were hard to find, harder to understand, and not written for young eyes. The investigation found that consent—when it was asked for—wasn’t valid. And for children, it wasn’t even relevant, because the data collection shouldn’t have happened in the first place.
What this means: Children weren’t just uninformed—they were unprotected.
4. Biometric Data, Quietly Collected
Facial recognition. Voice analysis. TikTok used these tools to guess users’ age and gender. That’s biometric data—deeply personal, and potentially risky. But users weren’t told. There was no transparency, no clear explanation.
What this means: Kids’ faces and voices were turned into data points, without their knowledge.
Why This Matters
This isn’t just about TikTok. It’s about every platform that welcomes children without truly protecting them. It’s about the gap between what companies say they do—and what they actually do. We all share a responsibility to make sure kids aren’t just safe from harm, but safe from being invisibly shaped by algorithms and data brokers.
Children aren’t just smaller users; they’re vulnerable. They’re still learning who they are. And they deserve digital spaces that respect that.
The Call to Action: Let’s Rebuild the Trust We’ve Lost
This moment is a reckoning—but it’s also a rare opportunity.
We can build platforms that recognize children not as data points, but as developing minds. We can design systems that don’t just comply with the law, but honor the spirit of childhood—its vulnerability, its curiosity, its need for protection.
We can create digital spaces where:
- Age assurance is real and respectful, not performative.
- Consent is meaningful, not buried in legalese.
- Data collection is minimal and purposeful, not exploitative.
- Biometric technologies are used with transparency, or not at all.
- And most importantly, children are treated as people to protect, not profiles to monetize.
If you build or manage digital platforms, this is your moment. Rethink your age gates. Rethink your consent flows. Rethink what data you really need—and what you should never touch.
One day, today’s children will look back at the digital world we gave them. Let’s make sure they see a place that respected their rights, honored their innocence, and protected their future.
Let’s make sure it was worthy of their trust.
To learn more, visit www.inq.consulting or contact us at ai@inq.consulting. To keep up with the latest tech and AI news, subscribe to the Think INQ newsletter.