By Kyri Lefteri
Children’s law in England is a field defined by its contradictions: it is both deeply personal and profoundly systemic; both reactive and proactive. It is a space where the law intersects with the most vulnerable moments in human lives, and where practitioners must navigate the delicate balance between legal principles and the realities of childhood. As we look at the current landscape, it becomes clear that this area of law is not just evolving; it is being reshaped by the pressures of modern society. This article focuses on the hidden dangers facing children today arising from online harm, and on how the current law in the UK is falling behind the curve of societal developments.
The Legal Black Hole of Parental Oversharing
Child protection is the moral heart of children’s law, but it is also its most ethically complex area. Practitioners must grapple with the tension between intervention and autonomy, between safeguarding children and respecting family dynamics and the individual choices and values of each parent and household. The rise of online dangers such as cyberbullying, grooming, and exploitation presents new challenges to professionals and courts tasked with deciding what amounts to significant harm and what is a diverse but ultimately reasonable exercise of parental responsibility.
Many parents appear to lack an understanding of the dangers posed by social media platforms. The modern propensity of parents to publish their children’s lives on public platforms, with children seemingly having no say in the matter, raises serious concerns about autonomy, privacy and the safety of the children involved. In an age where a child’s life can be documented from birth on social media, the legal system in the UK remains dangerously behind the curve. Despite existing protections under the UK GDPR and the Data Protection Act 2018, children still lack meaningful power to control their digital identities, especially when the harm comes not from strangers or corporations, but from their own parents.
The problem lies in a fundamental imbalance. Parents are both the legal guardians of a child’s rights and often the very people violating them. The act of “sharenting”, where parents share personal details, images, and milestones of their children online, is widely normalised, yet it can have lasting consequences. The impacts of “sharenting” on children range from the more widely understood, such as identity theft and bullying, to the more recently emerging malevolent creation of “deepfakes” and the sexual use of photographs by those intent on causing serious harm. The problem is that the law offers children minimal recourse, and the age-old adage of the ‘horse having already bolted’ could not be more apt.
Under UK GDPR, a child has a right to privacy, data protection, and erasure. Recital 38 UK GDPR states that “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.”
Article 17 introduces a right for individuals, including children, to have personal data erased. This right is not absolute and only applies in certain circumstances. The Information Commissioner’s Office emphasises the right to erasure if the request relates to data collected from children, which reflects the enhanced protection of children’s information, especially in online environments under UK GDPR. The guidance states that “if you process data collected from children, you should give particular weight to any request for erasure if the processing of the data is based upon consent given by a child – especially any processing of their personal data on the internet. This is still the case when the data subject is no longer a child, because a child may not have been fully aware of the risks involved in the processing at the time of consent.”
But these rights are filtered through the concept of “Gillick competence”, a vague and subjective test of whether a child is mature enough to exercise those rights independently. There appears to be no relief for children who are not yet deemed competent to exercise those rights. Even when competent children object to the content shared, enforcement is rare. Platforms often side with the uploader, and there is no clear legal mechanism empowering children to force the removal of unwanted posts by their parents.
Moreover, the ICO’s guidance, while child-friendly in tone, lacks legal teeth when it comes to holding parents accountable for online harms they may cause. The UK has yet to establish a precedent that shows a real willingness to support children’s privacy claims against parents.
The Children’s Code, created by the Information Commissioner’s Office following the enactment of Section 123 of the Data Protection Act 2018, creates obligations for online services to design age-appropriate systems. Following the Data (Use and Access) Act 2025 (DUAA) receiving Royal Assent on 19 June 2025, the guidance is under review; according to the ICO website, there has been no public consultation on the proposed changes, and the final version of the guidance is due for publication in autumn 2025. Section 81 DUAA adds a new duty under Article 25(1B) UK GDPR requiring developers and services to carry out more child-focused risk assessments, obliging organisations to ask not just “could children access this service?” but “what design features, defaults, and interfaces might expose them to risk, and how can we guard against that?” The Code may be updated to reflect these new statutory duties.
These measures and developments are a step in the right direction, but they target tech platforms and developers, not individuals. They sidestep the growing issue of family-based data exposure, which often begins before a child can speak, let alone consent.
Risks of AI in Children’s Law
The growing use of artificial intelligence (AI) in legal processes introduces both opportunities and risks to children’s law. AI-powered tools, such as predictive analytics and automated decision-making systems, have the potential to streamline case management and improve efficiency. However, their use also raises significant concerns. For example:
- Bias and Fairness: When used in the court system, AI systems may inadvertently perpetuate biases present in the data they are trained on, leading to unfair outcomes in sensitive family disputes.
- Privacy and Data Security: The reliance on AI tools often involves processing large amounts of personal data, raising concerns about the security and confidentiality of sensitive information relating to children and families within children’s law proceedings.
- Human Oversight: While AI can assist in decision-making, it cannot replace the nuanced judgment and empathy required in children’s law cases. Over-reliance on AI risks undermining the human element essential to safeguarding children’s welfare.
- Recent Incident Highlighting AI Risks: A recent case involving a barrister who submitted suspected AI-generated fake case citations underscores the dangers of relying on AI without proper verification. The barrister faced significant professional consequences after the fabricated cases were exposed. This incident serves as a cautionary tale for legal professionals, highlighting the importance of human oversight and the need to verify AI-generated outputs. In child protection, such errors could have devastating consequences, potentially jeopardising the welfare of children and families.
- Litigants in Person: Unrepresented individuals are increasingly relying on AI-generated content that can be inaccurate or not tailored to their specific circumstances. This can lead to procedural mistakes and misinterpretation of the law by litigants in person. Perhaps most impactful is the potential to present irrelevant or even misleading arguments to the court. While AI can offer some support to litigants in person, its misuse may ultimately impact the court’s ability to make informed, balanced decisions.
Legal professionals must approach the integration of AI with caution, ensuring robust oversight and ethical guidelines to mitigate these risks while leveraging its benefits.
Opportunities for Innovation: A Call to Action
The challenges facing Children’s Law are not just legal; they are societal, ethical, and deeply human. The rise of online harm and the complexities of AI add new dimensions to an already intricate field. Practitioners must navigate these challenges with care, balancing the demands of the law with the realities of the lives it governs.
Children’s Law in England is at a crossroads, shaped by the pressures of modern society and the evolving needs of the families it serves. It is a field that demands not just legal expertise but empathy, creativity, and a commitment to justice. The ultimate goal is not just to resolve disputes but to create a system that truly supports the welfare of children.
It’s time to reconsider whether parental authority online should remain unchecked. As children increasingly come of age with a digital footprint they never agreed to, the legal system must evolve to reflect their rights not just as dependents, but as individuals. A robust legal framework should recognise a child’s autonomy over their own image and data, especially when the harm comes from home.
In the absence of such a legal framework, education for parents becomes key, and it too is sadly lacking. Parenting in today’s world seems to many like a constant fight to keep your head above water. It would likely be a relief to parents if the law could at least highlight these issues, prompting them to think about respecting children as individuals. Until then, children in the UK remain vulnerable to a lifetime of exposure, with limited tools to reclaim their digital selves.
This is not just a legal challenge; it is a societal one, requiring collaboration across disciplines and a willingness to confront difficult questions. As practitioners, we have the opportunity, and the responsibility, to shape the future of children’s law. The question is not just “What can we do?” but “What must we do?” For in the end, the welfare of children is not just a legal issue; it is a moral imperative.