AI Exec Limits Kids’ Tech: Screen Time & Parental Controls

Tech Executives Grapple with Children’s Digital Exposure

Jack Clark, head of policy at Anthropic, a leading artificial intelligence firm, says his professional life is increasingly consumed by the need for AI guardrails. But those concerns aren’t confined to the workplace. Clark is also actively implementing safeguards for his own children, limiting their exposure to technology and carefully considering the implications of algorithmic influence.

Speaking on “The Ezra Klein Show” podcast, Clark described a parenting approach rooted in a cautious view of technology’s role in childhood development. He acknowledged that the growing ubiquity of digital devices makes it increasingly difficult for parents to shield their children completely, but he believes a deliberate effort to manage screen time remains crucial.

“I have the classic Californian technology executive view of not having that much technology around for children,” Clark said, noting he recently returned from parental leave with a newborn. “I think finding a way to budget your child’s time with technology has always been the work of parents and will continue to be.”

Currently, Clark allows his toddler access to select programming, primarily the popular children’s show “Bluey,” on their smart TV. However, he has resisted the temptation to grant unrestricted access to platforms like YouTube. “It freaks me out,” he admitted, expressing concern about the potential for algorithmic rabbit holes and exposure to inappropriate content.

Clark’s approach is far from unique within the tech industry. A growing number of leaders who have built fortunes on digital innovation are expressing reservations about technology’s impact on their own families. In 2025, model and entrepreneur Miranda Kerr revealed that she and her husband, Snap CEO Evan Spiegel, prohibited their then-14-year-old son from having phones or computers in his bedroom after 9:30 pm. In 2024, PayPal cofounder Peter Thiel said he limits his children’s screen time to just 90 minutes per week. The concerns stretch back further: Apple cofounder Steve Jobs famously told the New York Times in 2010 that his children hadn’t used an iPad.

“We limit how much technology our kids use at home,” Jobs said at the time.

Clark’s own parenting philosophy is, in part, a reflection of his upbringing. He recalled his father allowing him limited access to a computer at his office, but intervening when screen time became excessive. “My dad would let me play on the computer, and at some point he’d say: Jack, you’ve had enough computers today. You’re getting weird,” Clark recounted with a hint of amusement.

Looking ahead, Clark believes the need for robust parental controls will only intensify as AI systems become more sophisticated and accessible. He emphasized the importance of building these safeguards into AI platforms, recognizing that children will inevitably attempt to circumvent age restrictions. “So we’re going to need to build pretty heavy parental controls into this system,” he said. “We serve ages 18 and up today, but obviously, kids are smart, and they’re going to try to get onto this stuff.”

The challenge, Clark suggests, isn’t simply restricting access but thoughtfully designing AI systems that prioritize child safety and well-being. This requires a proactive approach: anticipating the ways children might interact with these technologies and developing mechanisms to mitigate potential risks.

The debate over children and technology is not new, but the emergence of powerful AI systems adds a new layer of complexity. As these technologies continue to evolve, parents and policymakers alike will be grappling with the question of how to balance innovation with the need to protect the next generation from potential harms. The concerns voiced by Clark and other tech leaders underscore the growing recognition that simply building the technology isn’t enough; responsible development must include a deep consideration of its societal impact, particularly on vulnerable populations like children.

YouTube acknowledged the concerns surrounding children’s online safety and pointed to resources on its website dedicated to age-appropriate experiences and parental controls.
