CRC/C/GC/25
==
### COMMITTEE ON THE RIGHTS OF THE CHILD
# General Comment on children’s rights in relation to the digital environment
The Committee on the Rights of the Child is drafting a General Comment on children's rights in relation to the digital environment.
In March 2019, the Committee invited all interested parties to comment on the concept note of the general comment. Submissions on the concept note are published on the Committee's webpage.
In August 2020, the Committee invited all interested parties to comment on its draft general comment. The Committee appreciates the 142 submissions it received from States, regional organizations, United Nations agencies, national human rights institutions and Children's Commissioners, children's and adolescent groups, civil society organizations, academics, the private sector, and other entities and individuals.
The Committee will take into account all submissions when deciding on the content of the final version of the general comment. All submissions are published [**HERE**](https://www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx).
---

FROM: [CRC/C/GC/25](https://tbinternet.ohchr.org/_layouts/15/treatybodyexternal/Download.aspx?symbolno=CRC/C/GC/25&Lang=en)
## I. Introduction
Children in diverse contexts see digital technology as critical to their lives now and in the future. They find benefits in using digital technology: “I express [online] what I see as important through my eyes in the world”; “I need technology for school and for fun”; “When you are sad, the internet can help you see something that brings you joy”. Children also call for action to support, promote and protect their safe engagement with these technologies: “I would like the government, technology companies and teachers to help us manage untrustworthy information online”; “I want [my parents] to ask permission before they upload a photo of me”; “I want to know more about what specific companies are using my data”.1
The digital environment is continually evolving and expanding. It includes the internet and mobile technologies; digital networks, content, services and applications; old and new systems of media, communication and information; connected devices and environments; virtual and augmented reality; artificial intelligence, including machine learning; robotics; automated systems and data analytics; and biometrics and biotechnology.
Children’s rights shall be respected, protected and fulfilled in the digital environment. Innovations in digital technologies impact children’s lives and their rights in ways that are wide-ranging, cumulative and interdependent. Meaningful access to digital technologies can support children to realize the full range of their civil, political, economic, social and cultural rights. However, digital inequalities between children within one State, and between children living in different States, can have a detrimental impact on children’s enjoyment of their rights. If digital inclusion is not improved, already existing inequalities are likely to be exacerbated.
As societies progressively rely upon digital technologies for their functioning, the opportunities and the risks of harm for children are likely to increase, even where children do not actively access the internet.
The digital environment is becoming increasingly important in many aspects of children’s lives, both as part of normal life and during times of crisis. Yet its short- and long-term impacts on children’s well-being and their rights are uncertain. It is therefore important to ensure that children benefit from engaging with the digital environment while mitigating the associated harms, including for children in disadvantaged or vulnerable situations.
The general comment draws on the Committee’s experience of reviewing States parties’ reports, its 2014 Day of General Discussion on digital media and children’s rights, the jurisprudence of human rights treaty bodies, special rapporteurs and the Human Rights Council, international consultations with experts and stakeholders, and the participation of children through a cross-national consultation. The children’s consultation comprised participatory workshops with 709 children living in urban or rural areas in 28 countries in different regions. It included children from minority groups; children with disabilities; migrant or refugee children; children in street situations; children in contact with the child justice system; children from low socioeconomic communities; and children in other disadvantaged or vulnerable situations.
The present general comment has been developed to provide an overview of how the Convention on the Rights of the Child in its entirety needs to be understood and implemented in relation to the digital environment. It should be read in conjunction with other relevant general comments issued by the Committee and its guidelines regarding the implementation of the Optional Protocol to the Convention on the sale of children, child prostitution and child pornography.
## II. Objective
The objective of the general comment is to provide guidance on legislative, policy and other appropriate measures to ensure full compliance with the obligations under the Convention and its Optional Protocols in the light of the opportunities, risks, and challenges for children’s rights in the digital environment.
## III. General principles
The following four principles provide a lens through which implementation of all other rights under the Convention should be viewed. These act as a guide for determining the measures needed to guarantee the realization of children’s rights in relation to the digital environment.
### A. The right to non-discrimination (art. 2)
The right to non-discrimination requires that States ensure all children, including children of lower income families and children living in rural and remote areas, have equal and effective access to the digital environment in ways that are meaningful for them.2 States should take all necessary measures to lower the cost of connectivity, provide free access to children in safe dedicated public spaces, and invest in policies and programmes that support all children’s use of digital technologies at school, home, and in their community, to overcome inequalities and improve digital inclusion.
Children report many forms of discrimination in relation to the digital environment, including through actions that result in exclusion from digital technologies and services, or hateful communication or discriminatory treatment. Children may be unaware of other forms of discrimination, including those that may result from the deployment of automated decision-making based on protected, biased, partial or unfairly obtained information.
Specific groups of children may require particular measures to prevent discrimination on the grounds of sex, disability, socioeconomic background, ethnic or national origin, or any other ground. This includes minority and indigenous children, asylum-seeking, refugee and migrant children, LGBTI children, child victims of sexual exploitation, children in poverty, children in alternative care, including institutions, and children in other vulnerable situations. For such groups, the digital environment may provide unique access to vital resources, but it may also present heightened risks.
### B. The best interests of the child (art. 3, para. 1)
The best interests of the child is a dynamic concept that requires an assessment appropriate to the specific context.3 Although the digital environment was not originally designed for children, they occupy the digital space along with adults. Therefore, this principle has special importance in relation to the digital environment. States shall ensure that, in all decision-making regarding the provision, regulation, design and management of the digital environment that may affect children’s rights, the best interests of the child are a primary consideration.
When making decisions relating to the regulation of the digital environment, States shall consider the nature, scale and prevalence of potential harms and violations of children’s rights in such environments, weighed against the asserted interests and rights of others, and shall apply the best interests of the child as the determining principle. States shall ensure transparency in the assessment of the best interests of the child and demonstrate what criteria have been applied.
### C. Right to life, survival and development (art. 6)
Online experiences and opportunities provided by the digital environment are of crucial importance for children’s development, and may be vital for children’s life and survival, especially in situations of emergency.
States shall take all appropriate measures to protect children from risks and threats to their right to life, survival and development in the digital environment. These include content, contact and conduct risks, and threats such as bullying, gambling, sexual exploitation and abuse, and incitement to suicide and other life-threatening activities, including by criminals, armed groups and those designated as terrorist groups. States should identify and address the emerging risks that children face in diverse contexts by consulting them, as children have important insight into the particular and emerging risks they face.
States should pay specific attention to the earliest years of life, when brain plasticity is maximal and the social environment, particularly relationships with parents and caregivers, is crucial in shaping the child’s cognitive, emotional and social attitudes and skills. Moreover, although the evidence is as yet insufficient to establish that early use of digital devices increases the risk of later digital addiction, a precautionary approach should nonetheless be taken in this respect. Uses of digital technologies may help or hinder children’s development, depending on their design, purpose and use.
Since direct social relationships play a crucial role in shaping the child's cognitive, emotional, and social attitudes and abilities, the use of digital devices should not substitute for direct, responsive interactions amongst children themselves or between children and their parents and caregivers, such as talking, reading and playing. When determining the appropriate use of digital devices, and advising parents, caregivers, educators and other relevant actors, States should take into account research on the effects of digital technologies on children’s development, especially during the critical neurological growth spurts of early childhood and adolescence.4
### D. The right to be heard (art. 12)
Children report that the digital environment affords them crucial opportunities for their voice to be heard.5 The use of digital technologies can enhance children’s right to be heard in matters that affect them and help to realize children’s participation at local, national and international levels.6 States should offer training and support to children, and provide access to child-friendly platforms, to enable them to express their views and become effective advocates for their rights. While States are encouraged to utilize the digital environment to consult with children on relevant legislative and policy developments, they should ensure that children’s participation does not result in undue monitoring or data collection that violates their right to privacy. States should also ensure that consultative processes are inclusive of children who lack access to technology.
When developing laws, policies, programmes, services and training on children’s rights in relation to the digital environment, States should involve children, especially children in disadvantaged or vulnerable situations, and victims of harm related to the digital environment, listen to their needs and give due weight to their views. States should ensure that designers and providers of digital technologies and services actively engage children, applying appropriate safeguards, and give their views due consideration when developing their services.
## IV. Evolving capacities (art. 5)
States shall respect the evolving capacities of the child as an enabling principle that addresses the process of their gradual acquisition of competencies, understanding and agency.7 This process has particular significance in the digital environment, where children can engage more independently of the supervision of parents and caregivers.
The policies adopted to implement children’s rights in the digital environment need to vary according to children’s evolving capacities in order to reflect an appropriate balance between protection and emerging autonomy.8 In designing these policies, and the frameworks within which children engage with the digital environment from early childhood to adolescence, States shall consider: the changing position of children and their agency in the modern world; children’s competence and understanding that develop unevenly across areas of skill and activity; the nature of the risks involved in balance with the importance of taking risks in supported environments in order to develop resilience; and individual experience, capacity and circumstances.9 States should require digital providers to offer or make available services to children appropriate for their evolving capacities.
In accordance with the States’ duty to render appropriate assistance to parents and caregivers in the performance of their child rearing responsibilities, States should promote the awareness of parents and caregivers to respect children’s evolving autonomy and capacities and need for privacy. They should inform and support parents and caregivers in acquiring digital technology skills to help them to assist children in relation to the digital environment.
## V. General measures of implementation by States (art. 4)
Opportunities for the realization of children’s rights and their protection in the digital environment require a broad range of legislative, administrative and other measures, including precautionary ones. In the development of policies and practices that affect children’s rights regarding the digital environment, States should consult with children, their parents and caregivers.
### A. Legislation
States should review and update national legislation to ensure the digital environment is compatible with the rights in the Convention and its Optional Protocols and that it remains relevant in the context of technological advances and emerging practices. States should mandate the use of child rights impact assessments to inform the development of legislation.10
### B. Comprehensive policy and strategy
States should ensure that national policies and/or strategies for children’s rights, and regarding the development of the digital environment, as well as any corresponding action plans, address children’s rights issues related to the digital environment and that they are regularly updated.
In addition to regulation, industry codes and design standards, such action plans should establish and promote, inter alia, training and guidance for parents and caregivers, relevant professionals and the public, programmes to develop children’s digital skills and access to opportunities. Such measures should protect children, including from online sexual abuse and exploitation, and provide remedy and support for child victims and measures to meet the needs of children in disadvantaged or vulnerable situations, including resource materials translated into relevant minority languages.
States should ensure the operation of effective child online protection and safeguarding policies in settings where children access the digital environment, including pre-schools, schools, cybercafés, youth centres, alternative care settings and institutions where children live.
### C. Coordination
States should identify a government body that is mandated to coordinate policies and programmes related to children’s rights in the digital environment among central government departments and different levels of government.11 It should also cooperate with businesses, civil society and other organizations to realize children’s rights in relation to the digital environment at cross-sectoral, national, regional and local levels.12 Such a body should be able to draw on technological and other relevant expertise within and beyond government as needed. It should be independently evaluated for its effectiveness in meeting its obligations.
### D. Allocation of resources
States should mobilize, allocate and utilize public resources to fully implement legislation, policies and programmes to realize children’s rights in a digital environment and improve digital inclusion to reflect the increasing impact of the digital environment on children’s lives and to promote equality of access and affordability of services and connectivity.13
Where resources are contributed from the private sector or obtained through international cooperation, States should ensure that their own revenue mobilization, budget allocation and expenditure are not interfered with or undermined by third parties.14
### E. Data collection and research
Data collection and research are vitally important as a means of mapping and understanding the implications of the digital environment for children’s rights, and for evaluating its impact on children and the effectiveness of State interventions. States should ensure that the production of robust, comprehensive data is adequately resourced. Such data and research, including research conducted with and by children, should inform regulation, policy and practice and should be in the public domain.15
### F. Independent monitoring
States should ensure that the mandates of national human rights institutions, or other appropriate independent institutions, also cover children’s rights in the digital environment, and are able to receive, investigate and address complaints from children and their representatives.16 Where independent oversight bodies to monitor the activities in relation to the digital environment exist, national human rights institutions should work closely with such bodies to effectively discharge their mandate regarding children’s rights.17
### G. Dissemination of information, awareness-raising and training
States should disseminate information and conduct awareness-raising campaigns on the rights of the child in the digital environment. States should facilitate educational programmes for children, parents and caregivers, as well as the general public and policymakers, to enhance their knowledge of children’s rights and develop their digital literacy and skills. These should cover how children can benefit from digital services, how to minimize risks and how to recognize a child victim of online harm and respond appropriately.
Professionals working for and with children in all settings, including in health and mental health facilities, in social work, alternative care institutions, law enforcement, the justice system as a whole, and the business sector including the technology industry, should receive training that includes how the digital environment impacts the rights of the child in the multiple contexts and ways in which children access and use technologies. States should ensure that pre-service and in-service training relating to the digital environment is provided for educators working in nurseries, schools and other learning settings.
### H. Cooperation with civil society
States should systematically involve civil society, including non-governmental organizations working both in the field of children’s rights and in the field of the digital environment, in the development, implementation, monitoring and evaluation of laws, policies, plans and programmes related to children’s rights and ensure that civil society organizations are able to implement their activities related to the promotion and protection of the rights of children.
### I. The business sector
The business sector, including business enterprises and not-for-profit organizations, directly and indirectly affects children’s rights through its activities and operations in providing services and products related to the digital environment. States have an obligation to ensure that the business sector meets its responsibilities for children’s rights in relation to the digital environment by taking all necessary measures, including the adoption of legislation and regulations and the development, monitoring and enforcement of policy. Children are calling for businesses to better respect, protect and remedy their rights in relation to the digital environment.18
States should require businesses to prevent their networks or online services from being misused for purposes that threaten children’s safety and well-being, and to provide parents, caregivers and children with timely safety advice and prompt and effective remedy.
States should require business enterprises to undertake child-rights due diligence, in particular to carry out and disclose to the public child-rights impact assessments, with special consideration for the differentiated and, at times, severe impact of the digital environment on children.19 States should take appropriate steps to prevent, monitor, and investigate child rights violations by businesses in the digital environment.
In addition to developing legislation, States should require businesses that impact on children’s rights in relation to the digital environment to establish and implement regulatory frameworks, industry codes and terms of service that adhere to the highest standards of ethics, privacy and safety in relation to the design, engineering, development, operation, distribution and marketing of their technological products and services. States should also require businesses to maintain high standards of transparency and accountability, and encourage them to take measures to innovate in the best interests of children.
### J. Commercial advertising and marketing
The digital environment includes businesses that rely financially on processing personal data to target advertising, marketing and revenue-generating or paid content; this impacts, intentionally and unintentionally, on the digital experience of children. Many of these processes involve multiple commercial partners, creating a supply chain of commercial activity and processing of personal data that may result in violations of children’s rights. For example, advertising may incorporate design features that anticipate and guide a child’s actions towards more, or more extreme, content; automated notifications may interrupt sleep; and a child’s personal information or location may be used to target advertisements or other commercially driven content.
States should ensure that advertising and marketing are age-appropriate and that all forms of commercially driven content are clearly distinguished from other content.
States should prohibit by law the targeting of children of any age for commercial purposes on the basis of a digital record of their actual or inferred characteristics. Neuromarketing of child-directed products, applications and services should also be prohibited.
Where parental consent is required to process children’s personal data, States should require that efforts are made to verify that consent is informed, meaningful and given by the actual parent or caregiver of the child.
### K. Remedies
Children whose rights relating to the digital environment have been violated face particular challenges in accessing justice, owing to difficulties in obtaining evidence and identifying perpetrators, or because they lack knowledge of their rights or of what constitutes a violation of those rights in the digital environment. Further challenges may arise owing to the sensitivity of the issues involved, including the disclosure of children’s online activities and fears of reprisals by peers or of social exclusion.
States should ensure that appropriate and effective remedial judicial and non-judicial mechanisms for violations of children’s rights relating to the digital environment are prompt, available and accessible to children and their representatives. Such mechanisms should include free, widely known, safe, confidential and child-friendly complaint and reporting mechanisms to the relevant authorities. States should also provide for collective complaints, including class action and public interest litigation. States should provide legal or other appropriate assistance to children whose rights have been violated through the digital environment.
States should establish, coordinate, and on a regular basis monitor and evaluate the framework for the referral of cases and provision of effective support to child victims.20 This includes measures for the identification, therapy and follow-up care and social reintegration of child victims. Measures within this framework should be multi-agency and child-friendly to prevent the child’s re-victimization and secondary victimization in investigative and judicial processes.
Appropriate reparation includes restitution, compensation and satisfaction, and may require apology, correction, removal of unlawful content or other measures.21 Remedial mechanisms should take into account the particular vulnerability of children to the possible irreversible effects and lifelong damage of violations of their rights. Reparation should be timely to limit ongoing and future damage. States should guarantee non-recurrence of violations, including by reform of relevant law and policy and their effective implementation.
States should provide specialized training for law enforcement officials, lawyers, prosecutors and judges that specifically includes issues related to the digital environment, in order to ensure the effective implementation and enforcement of law regarding such violations.
Children can face particular difficulties in obtaining remedy when their rights have been violated in the digital environment by business enterprises, in particular in the context of their global operations.22 States should consider measures to allow for extraterritorial jurisdiction when there is a reasonable link between the State and the conduct concerned. States should ensure that businesses provide effective complaints mechanisms; this should not, however, prevent children from accessing State-based remedies. State agencies with oversight powers relevant to children’s rights, such as those for health and safety, consumer rights, education, and advertising and marketing, shall monitor and investigate complaints and provide and enforce remedies for violations of children’s rights in the digital environment.23
States should provide children with child-sensitive and age-appropriate information, in the child’s own language, on their rights, on the reporting and complaint mechanisms in place and on the services and remedies available when their rights are violated in relation to the digital environment. This information should also be provided to parents, caregivers, educators and people working with and for children.
## VI. Civil rights and freedoms
### A. Access to information (arts. 13 and 17)
The digital environment provides a unique opportunity for children to realize the right of access to information, and to create and distribute their own content. In this regard, the mass media perform an important function, one that extends to digital and online media content.24
States should provide and support the creation of child-friendly, age-appropriate digital content for children in accordance with their evolving capacities, and ensure that children are able to access a wide diversity of information, including information about culture, sports, arts, health, civil and political affairs, and children’s rights, from a plurality of media and other sources including information held by public bodies. This ability to access relevant information can have a significant positive impact on equality.25
States should encourage the production and dissemination of such content from a plurality of national and international sources, such as social media platforms, online publications, podcasts and video-streaming services, as well as broadcasters, museums, libraries and other relevant organizations. States should make particular efforts to enhance the provision, from early childhood, of diverse, accessible and beneficial content for children with disabilities and children from ethnic, linguistic, indigenous and other minorities.
States should ensure that children are informed about, and can easily find, diverse and good-quality information online, including content independent of commercial or political interests. States should ensure that automated search and recommendation systems do not prioritize paid content, such as content promoted by an online influencer with a commercial or political motivation, at the cost of children’s right to information.
The digital environment can include biased, gender-stereotyped, discriminatory, racist, hateful, violent, pornographic and exploitative information, as well as false narratives, misinformation and disinformation, for example false health cures or false narratives about a faith community, and information encouraging children to engage in unlawful or harmful activities, including by criminal, armed or terrorist groups. States should require businesses and other providers of digital content to develop and implement guidelines to enable children to safely access a diversity of content, protecting them from such harmful material in accordance with their evolving capacities and recognizing children’s right to information and freedom of expression.26 Any restrictions on the operation of internet-based, electronic or other information dissemination systems are permissible only to the extent that they are compatible with article 13.27
States should encourage providers of digital services used by children to apply concise and intelligible content labelling, for example on age-appropriateness, and to provide user-friendly guidance and educational materials for children, parents and caregivers, educators and relevant professional groups.28 Age- or content-based systems designed to protect children from age-inappropriate content should be consistent with the principle of data minimization.
States should ensure that digital providers comply with relevant guidelines, standards and codes,29 enforce their own community content rules and provide sufficient content moderation to meet their published terms. Content controls, including parental control tools, school filtering systems and restrictions on the operation of internet-based, electronic or other information dissemination systems, should not be used to restrict children’s access to the digital environment, but only to prevent the flow of harmful material to children. Such controls should balance protection against children’s other rights, notably their rights to freedom of expression and privacy.
States should encourage media and other relevant organizations to provide reliable information to parents and children about the nature of digital services and the associated opportunities and risks. Professional codes of conduct for journalists should provide guidance on reporting digital risks and opportunities relating to children in a proportionate and evidence-based manner.
### B. Freedom of expression (art. 13)
A child’s right to freedom of expression includes freedom to seek, receive and impart information and ideas of all kinds, using any media of their choice. Children report30 that these technologies offer significant scope to express their ideas, opinions and political views. For children in disadvantaged or vulnerable situations, online participation with others who share their experiences can help them to express themselves.
Any restrictions on children’s right to freedom of expression in the digital environment, such as filters and other barriers, including safety measures, shall be provided by law and be necessary and proportionate. States should provide children with information on how to effectively exercise this right, particularly how to create and share digital content, while respecting the rights and dignity of others and not violating legal rules, such as those related to incitement to hatred and violence.
When children express their political or other views and identities in the digital environment, this may attract criticism, hostility, threats or punishment. States should protect children from online harassment and threats, censorship, data breaches and digital surveillance. Children should not be prosecuted for expressing their opinions in the digital environment.
Given the existence of commercial and political motivations to promote particular worldviews in the digital environment, States should ensure that uses of automated decision-making do not supplant, manipulate or interfere with children’s ability to form and express their opinions in the digital environment.
### C. Freedom of thought, conscience and religion (art. 14)
States shall respect the right of the child to freedom of thought, conscience and religion in the digital environment. Automated systems are sometimes used to make inferences about a child’s inner state, in education, health, criminal justice or commercial contexts, among others. States shall ensure that automated systems are not used to affect or influence children’s behaviour or emotions.
The Committee encourages States to introduce or update data protection regulation and design standards that identify, define and prohibit practices which manipulate or interfere with the child’s right to freedom of thought and beliefs, for example by emotional profiling, in the digital environment.
States should ensure that children are not penalized for their religion or beliefs, nor have their future opportunities restricted in any other way. The exercise of children’s right to manifest their religion or beliefs in the digital environment may be subject only to limitations that are provided by law and are necessary and proportionate.
### D. Freedom of association and peaceful assembly (art. 15)
The right to freedom of association and peaceful assembly enables children to form their social, religious, sexual and political identities, and to participate in associated communities as well as in public spaces for deliberation, cultural exchange, social cohesion and diversity.31 Children report that online spaces provide them with valued opportunities to meet and deliberate with peers, decision-makers and others who share their interests.32
States should ensure that their laws, regulations and policies protect children’s right to participate in social, civic, political, religious, environmental and cultural organizations that operate partially or exclusively in the digital environment. No restrictions may be placed on the exercise of children’s right to freedom of association and peaceful assembly in the digital environment other than those that are provided by law and are necessary and proportionate.33 States should ensure that children’s participation in associations or assemblies in the digital environment does not result in negative consequences to those children, such as exclusion from a school, deprivation of a scholarship or police profiling.
Public visibility and networking opportunities in the digital environment can also support forms of child-led activism and empower children as advocates for their rights and the rights of others. The Committee recognizes that the digital environment enables child human rights defenders and children in vulnerable situations, including children with disabilities, children in street situations and children from indigenous, minority or other disadvantaged communities, to advocate for their rights, to communicate with each other and to form associations. States should support them and ensure their safety.
### E. Right to privacy (art. 16)
Privacy is vital for children’s agency, dignity and safety, and for the exercise of their rights. Threats to children’s privacy may arise from their own activities in the digital environment,34 as well as from the activities of others, for example by parents sharing photographs or other information about their children online, or by caregivers, other family members, peers, educators or strangers doing so. Threats to children’s privacy may also arise from data collection and processing by public institutions, businesses and other organizations, as well as from criminal activities such as hacking and identity theft.
Digital technologies are used to collect data about, inter alia, children’s identities, activities, location, communication, preferences and relationships. Children’s personal data are often processed to offer educational, health and other benefits to children. Certain combinations of personal data, including biometric data, can be used to uniquely identify a child. Digital practices such as automated data processing, behavioural targeting, mandatory identity verification and mass surveillance are becoming routine. Such practices may lead to arbitrary or unlawful interference with children’s right to privacy; they are rarely transparent to children or their parents or caregivers, and they may have adverse consequences for children that extend to later stages of their lives. Children are concerned about their privacy and want to better understand how their data are collected and used.
Interference with a child’s privacy is only permissible if it is neither arbitrary nor unlawful (article 16). This means any such interference must be provided for by law, be aimed at achieving a legitimate purpose, be proportional and not in conflict with the provisions, aims and objectives of the Convention.
States shall take legislative and other measures to ensure that children’s privacy is respected and protected by all organizations and in all environments that process their data. Such legislation should include strong safeguards, independent oversight and access to remedy. States should encourage the adoption of privacy-by-design measures, such as end-to-end encryption, in services that impact on children. States should regularly review such legislation and ensure that procedures and practices prevent deliberate infringements or accidental breaches of children’s privacy. States should ensure that consent to process a child’s data is informed and freely given by the child or, depending on the child’s age and maturity, by the parent or caregiver, and obtained prior to the processing.
States should ensure that children, their parents or caregivers have access to the data stored, to rectify data that is inaccurate or outdated and to delete or rectify data unlawfully or unnecessarily stored by public authorities or private individuals or bodies.35 States should further ensure the right of children to withdraw their consent and object to personal data processing, at least in cases where the data controller does not demonstrate legitimate, overriding grounds for the processing.
The data processed should be accessible only to the authorities and individuals designated under the law to receive, process and use it in compliance with due process guarantees, and on a case-by-case basis.36 Children’s data gathered for defined purposes, in any setting, shall be protected, used exclusively for those purposes and retained for a clearly defined period. Where information is provided in one setting and can legitimately benefit the child by use in another setting, for example school and tertiary education, that use must be transparent, accountable and subject to the consent of the child, parent or caregiver, as appropriate.
Privacy and data protection legislation and measures should not arbitrarily limit children’s other rights, for example their right to freedom of expression or their protection rights. States should ensure that data protection legislation respects children’s privacy and personal data in relation to digital environments. Through continual technological innovation, the scope of the digital environment is expanding to include settings and objects, such as clothes and toys, that traditionally were not digital. The more the environments where children spend time become ‘connected’ (through the use of embedded sensors that are connected to automated systems), the more important it is that States ensure that the organizations, devices and services that constitute such environments are subject to robust data protection and other privacy regulations and standards. This includes public settings such as streets, schools, libraries, sports and entertainment venues, business premises including shops and cinemas, and the home or other settings where children live.
The digital surveillance of children may result in the constant scrutiny of children while online or offline, for example in educational and care settings. Any surveillance of children, together with any associated automated processing of personal data, shall respect the child’s right to privacy and shall not be conducted routinely, indiscriminately or without the child’s knowledge or, in the case of very young children, that of their parent or caregiver; where possible, the child should have the right to object to such surveillance.
The digital environment presents particular problems for parents and caregivers in respecting the child’s right to privacy. Technologies that monitor online activities for safety purposes may prevent a child from accessing a helpline or searching for sensitive information. States should advise parents and caregivers, and the public, on the importance of the child’s right to privacy and on how their own practices may threaten that privacy, for example by building a digital identity for the child that may be used by third parties in ways that can be revealing, embarrassing or even dangerous. They should also be advised that any monitoring of the child’s internet use should be proportionate and in accordance with the child’s evolving capacities, and informed of practices by which to respect and protect children’s privacy in relation to the digital environment.
Many children use online avatars or names that protect their identity, and such practices can be important to protect children’s privacy. States should take a safety-by-design approach to anonymity, ensuring that anonymous practices are not routinely used to hide harmful or illegal behaviour, such as bullying or hate speech. Safety-by-design might include encouraging platforms to forbid such behaviours in their published terms and to block users who fail to uphold those standards. Protecting a child’s privacy in the digital environment may be vital in circumstances where the parents or caregivers themselves pose a threat to the child’s safety or where they are in conflict over the child’s care, for example over custody or access. Such cases may require interventions such as family counselling or other services to safeguard the child’s right to privacy.
Providers of preventive or counselling services to children in the digital environment should be exempt from any requirement for a child user to obtain parental consent in order to access such services.37
### F. Birth registration and right to identity (arts. 7 and 8)
The right of the child to birth registration can be enhanced through digital birth registration systems, and States should promote the use of such systems. Lack of birth registration facilitates violations of children’s rights under the Convention and its Optional Protocols. States should use modern technology to ensure access to online birth registration, which should also be made available to children born prior to the introduction of online registration. To guarantee birth registration for children in remote areas, refugee and migrant children, children at risk and those in marginalized situations, States should use online mobile registration units. States should provide awareness-raising campaigns, establish monitoring mechanisms, promote community engagement and ensure effective coordination between civil status officers, notaries, health officials and child protection agencies. For such systems to benefit the realization of children’s rights, States should also ensure that they do not hinder children’s access to basic services or violate children’s privacy and identity. The integration of birth registration with digital identity systems will facilitate a child’s access to services, including health, education and protection.
States shall respect the right of every child to preserve his or her identity, in particular for refugee children and children in situations of migration, and shall act to prevent or resolve statelessness. Where a child is illegally deprived of some or all of the elements of his or her identity, States parties have a duty to provide appropriate assistance and protection with a view to re-establishing that identity.
## VII. Violence against children (arts. 19, 24 (3), 28 (2), 34, 37 (a) and 39; OPSC; OPAC)
As digital technologies continue to expand their role in the lives of children, States should regularly update and enforce the legislative, regulatory and institutional frameworks that protect children from recognized and emerging risks of violence, including psychological harm, in the digital environment. States should implement safety and protective measures in accordance with children’s evolving capacities; they shall also take legislative and regulatory measures to prevent the risks of harm that children may face.
The digital environment opens up new ways for sexual offenders to solicit children for sexual purposes, participate in online child sexual abuse via live video streaming, distribute child sexual abuse material, and commit the sexual extortion of children.
Digital technologies also bring additional complexity to the investigation and prosecution of crimes against children, which may cross national borders. States should address the ways in which uses of digital technologies may facilitate or impede the investigation and prosecution of diverse forms of physical or mental violence, injury or abuse, neglect or negligent treatment, maltreatment or exploitation, including sexual abuse, child trafficking and gender-based violence.
Forms of digitally mediated violence and sexual exploitation may be perpetrated within the child’s circle of trust, for instance by family and friends or, for adolescents, by intimate partners. Some harms in the digital environment are perpetrated by children themselves, not necessarily with a full understanding of the harm that can result. These may include cyberbullying, harassment, violence, the sharing of sexualized images of children (“sexting”) and the promotion of self-harming behaviours such as cutting, suicidal behaviour or eating disorders. Where children have carried out or instigated such actions, States should pursue preventive, safeguarding and restorative justice approaches whenever possible.38
The digital environment opens up new ways for non-State groups, including armed groups and those designated as terrorist groups, to recruit and exploit children to engage with or participate in violence. States should ensure that counter-terrorism legislation prohibits the recruitment of children by terrorist groups and that child offenders are treated as victims or, if tried, dealt with in accordance with child justice systems.39
States should ensure that business enterprises meet their responsibility to effectively protect children from all forms of violence in the digital environment, including cyberbullying, cyber-grooming, and sexual exploitation and abuse. Although businesses may not be directly involved in such harmful acts, they can be complicit in these violations of children’s right to freedom from violence. States should develop regulatory approaches to encourage and enforce the ways in which businesses meet these responsibilities, taking all reasonable and proportionate technical and procedural steps to combat criminal and harmful behaviour directed at children in relation to the digital environment.40
States should provide children with accessible, child-friendly and confidential online reporting and complaint mechanisms.
## VIII. Family environment and alternative care (arts. 5, 9, 18, 20)
Many parents and caregivers require support to build technological understanding, capacity and skills to assist children in relation to the digital environment. States should ensure that parents and caregivers have opportunities to gain digital literacy to learn how technology can support the rights of children and to recognize a child victim of online harm and respond appropriately.
Guidance on the digital environment should be informed by consultation with parents, caregivers and children, provided in the languages they understand, and widely disseminated. Special attention should be paid to parents and caregivers of children in disadvantaged or vulnerable situations.
In supporting and guiding parents and caregivers regarding the digital environment, States should promote their awareness of the need to respect children’s growing autonomy and need for privacy, in accordance with children’s evolving capacities. States should take into account that children often embrace and experiment with digital opportunities, and may encounter risks, at a younger age than parents and caregivers anticipate. Some children report wanting more support and encouragement in their digital activities, especially where they perceive parents’ and caregivers’ approach to be highly restrictive and not adjusted to their evolving capacities.41
States should take into account that support and guidance provided to parents and caregivers should be based on an understanding of the specificity and uniqueness of parent-child relations. Such guidance should help parents to sustain an appropriate balance between the child’s protection and emerging autonomy, and to prioritize positive parenting over prohibition or control. To help parents and caregivers maintain a balance between parental responsibilities and children’s rights, the best interests of the child, applied together with the child’s evolving capacities, should be the guiding principles.
Guidance to parents and caregivers should encourage children’s social, creative and learning activities in the digital environment. It should also explain that use of digital technologies cannot replace direct, responsive interactions amongst children themselves or between children and parents or caregivers.
States should ensure that children separated from their families, such as children in alternative care, migrant or refugee children, or children in street situations, have access to digital technologies including for the purpose of maintaining family relationships, when appropriate.42 States should ensure that children’s privacy in their online activities is respected. Digital technologies may also be beneficial in establishing relations between a child and prospective adoptive or foster parents, or reuniting children in humanitarian situations with their families. Therefore, in the context of all separated families, States should support the digital access of children and their parents, caregivers or other relevant persons.
Measures taken to enhance digital access should be balanced against the need to protect children in cases where parents or other family members, or caregivers, whether physically present or distant, may place them at risk. Such risks may be enabled through the design and use of digital technologies, for example by unintentionally revealing the location of the child to a potential abuser. States should ensure that parents and caregivers are fully conversant with the risks and aware of strategies to support and protect children.
## IX. Children with disabilities (art. 23)
The digital environment opens new avenues for children with disabilities to engage in social relationships with their peers, access information, and participate in public decision-making processes. States should pursue these new avenues and also take steps to overcome barriers faced by children in relation to the digital environment.
Barriers that children with disabilities face in the digital environment include insufficient access to assistive technologies at home, at school or in relation to culture, play and communication with peers. Children with disabilities can also encounter policies that have a discriminatory impact on them, such as a ban on their use of digital phones in some settings, even though children with disabilities may rely heavily on digital devices to communicate and access information. Further, many websites, applications, games and other digital services fail to meet universal design requirements to ensure accessibility.
States should ensure access to a wide range of affordable assistive technologies and to physical and virtual engagement where needed, in particular for children with disabilities living in poverty. States should provide guidance and resources to staff in schools and other relevant settings so that they have sufficient training to support children in using appropriate digital technology.
Children with disabilities should be involved in the design and delivery of policies, products and services that impact on the realisation of their rights in the digital environment. States should ensure that technologies are designed for universal accessibility so that they can be used by children with disabilities without the need for adaptation, and promote technological innovation that meets their requirements.
Children with disabilities can be more exposed to online risks, including bullying, in the digital environment. States should identify and address the safety risks faced by children with disabilities, taking steps to ensure that the digital environment is safe for them. Safety information, protective strategies, public information, services and forums relating to the digital environment should be provided in accessible formats.
## X. Basic health and welfare (art. 24)
Digital technology can significantly facilitate access to health services and information and improve diagnostic and treatment services for maternal, newborn, child and adolescent physical and mental health and nutrition. It also offers significant opportunities to reduce inequalities in access to health services and to reach children in disadvantaged or vulnerable situations or in remote communities. In public emergencies or humanitarian situations, for instance, access to health services and information through digital technology may become the only option.
Children report43 that they value searching online for information and support relating to health and well-being, and about physical, mental or sexual and reproductive health, including as regards puberty, sexuality and conception. Adolescents especially want access to free, confidential, age-appropriate and non-discriminatory mental health and sexual and reproductive health services online.44 States should ensure that children have safe, secure and confidential access to trustworthy health information and services, including psychological counselling services.45 These services relating to the digital environment should be provided by professionals or those with appropriate training, and regulated oversight mechanisms should be in place.
States should encourage and invest in research and development that focus on children’s specific health needs and promote positive health outcomes for children through technological advances. Digital services should be used to supplement or improve in-person provision of health services to children.46 States should introduce or update regulation that requires providers of health technologies to embed children’s rights in their functionality, content and distribution.
States should regulate against known harms and proactively consider emerging research and public health evidence to prevent the spread of misinformation that may harm children, materials damaging to children’s mental or physical health, and services that undermine children’s development, for example through persuasive design, excessive gaming or age-inappropriate features.47
States should encourage the use of digital technologies to promote healthy lifestyles, including physical and social activity.48 States should regulate targeted and age-inappropriate advertising, marketing and services so as to prevent children’s exposure to the promotion of unhealthy food and beverages, alcohol, drugs, tobacco and other nicotine products.49 Such regulations relating to the digital environment should be compatible with, and keep pace with, regulation in the offline environment.
Digital technologies offer multiple opportunities to children to improve their health and well-being, when balanced with their need for rest, exercise and direct interaction with their peers, families and communities. States should develop guidance for children, parents, caregivers and educators regarding the importance of a healthy balance of digital and non-digital activities and sufficient rest.
## XI. Education, leisure and cultural activities
### A. The right to education (arts. 28, 29)
The digital environment can enable and enhance children’s access to quality education, including resources for formal, informal, peer-to-peer and self-directed learning. Children highlight the importance of digital technologies in improving their access to education, as well as in supporting their formal and informal learning and participation in extracurricular activities.50
These resources can support children to engage with their own creative and cultural practices and to learn about those of others.51 States should enhance children’s online learning and encourage the award of certification to children where needed to prove their participation.
States should support educational and cultural institutions such as archives, libraries and museums to make available to children diverse digital and interactive learning resources, including indigenous resources and resources in the languages that children understand.
For children attending school, digital educational technologies can support engagement between teacher and student and among peer learners. For children not physically present in school or living in remote areas or in disadvantaged or vulnerable situations, digital educational technologies can enable distance or mobile learning programmes.52 States should ensure that schools have sufficient resources to provide parents with guidance on online home schooling and learning environments.
States should invest equitably in technological infrastructure in schools, ensuring the availability of a sufficient number of computers, quality connectivity and electricity, teacher training on the use of digital educational technologies, and the timely maintenance of school technologies. States should support the creation and dissemination of diverse digital educational resources of good quality and ensure that existing inequality is not exacerbated by problems regarding access to such resources.
States should develop evidence-based standards and guidance for schools and other bodies responsible for procuring and using educational technologies and materials to ensure these deliver valuable educational benefits. These standards for digital educational technology should ensure that uses of these technologies enhance children’s rights and do not expose children to violence, discrimination, misuse of their personal data, commercial exploitation or other infringements of their rights, including the use of digital technology to document a child’s activity and share it with parents without the child’s knowledge or consent.
States should ensure that schools teach digital literacy as part of the basic education curricula from the earliest years, and such teaching should be evaluated for its outcomes.53 Such curricula should include the skills needed to handle a wide range of digital tools and resources, including those related to content creation, collaboration, participation and civic engagement. They should include the critical understanding needed to find trusted sources of information and to identify misinformation and other forms of biased or false content; sexual and reproductive health issues relevant to the digital environment; and knowledge about human rights, including the rights of the child and of others in the digital environment, and about available forms of support and remedy. They should also promote awareness of the risks of children’s exposure to potentially harmful content, contact and conduct, including cyberbullying and other forms of violence, and of coping strategies to reduce harm and build children’s resilience.
It is of increasing importance that children gain an understanding of the digital environment, including its infrastructure, business practices, persuasive strategies, and uses of automated processing, personal data and surveillance. Teachers who provide digital literacy education, including sexual and reproductive health education, should be trained both in providing such education and in safeguarding as it relates to the digital environment.
### B. The right to culture, leisure and play (art. 31)
The digital environment promotes children’s right to culture, leisure and play, which is essential for their well-being and development.54 Children of all ages report that they find pleasure, interest and relaxation through engaging with a wide range of media of their choice,55 as well as concern that adults may not understand their digital play and how it can be shared with friends.56
Digital forms of culture, recreation and play should support and benefit children, and reflect and promote children’s diverse cultural identities, languages and heritage. This can facilitate children’s social skills, learning, expression, creative activities such as music and art, sense of belonging, and a shared culture.57 Participation in cultural life online enables creativity, identity, social cohesiveness and cultural diversity, and States should ensure that children can participate in online cultural life and express themselves.
States should regulate and provide guidance for professionals and parents, and collaborate with digital providers as appropriate to ensure that digital technologies and services that are intended for, or may be accessed by, children in their leisure time are designed, distributed and used in ways that enhance children’s opportunities for culture, recreation and play.
States should ensure that the promotion of opportunities for culture, leisure and play in the digital environment is balanced with the provision of attractive alternatives in the physical locations where children live. In the early years especially, children’s language, co-ordination and social skills, and emotional intelligence are largely gained through play that involves physical movement and direct face-to-face interaction with other people. For older children too, play and recreation that involve physical activities, team sports and other outdoor recreational activities can provide health benefits, as well as functional and social skills.
Leisure time spent in the digital environment may expose children to risks of harm, for example through surreptitious advertising or highly persuasive or even gambling-like design features. By introducing or using data protection, safety-by-design and other regulatory measures, States should ensure that businesses do not target children using these or other techniques designed to prioritize commercial interests over those of the child.
Where States or businesses provide guidance, age ratings, labelling or certification regarding certain forms of digital play and recreation, these should not curtail children’s access to the digital environment as a whole or interfere with their leisure opportunities.
## XII. Special protection measures
### A. Protection from economic, sexual and other forms of exploitation (arts. 32, 34, 35 and 36)
Children should be protected from all forms of exploitation prejudicial to any aspect of their welfare in relation to the digital environment. Such exploitation may take many forms, including economic exploitation such as child labour; sexual exploitation and abuse; the sale, trafficking and abduction of children; and the grooming of children to participate in criminal activities, including hacking and financial crimes.
By creating and sharing content, children may be economic actors in the digital environment. The Committee notes that, where children are involved in the production and distribution of content, this may expose them to economic and possibly other forms of exploitation. States should review relevant laws and policies to ensure that children are protected against economic and other forms of exploitation and that their rights with regard to work in the digital environment and related opportunities for remuneration are protected. States should also inform parents and children about the protections that apply, and ensure that appropriate enforcement mechanisms are in place.58 States should legislate to ensure that children are protected from harmful goods (such as weapons or drugs) and services (such as gambling). Robust age verification systems should be used to prevent children from acquiring access to products and services that are illegal for them to own or use. Such systems should be consistent with data protection and safeguarding requirements.
States should legislate to ensure that children are protected from crimes such as fraud and identity theft in the digital environment, and to allocate sufficient resources to ensure that such crimes are investigated and prosecuted. States should also require a high standard of cybersecurity, privacy- and safety-by-design in the digital services and products that children use, to minimize the risk of such crimes.
## B. Administration of child justice (art. 40)
As users of digital technology, children increasingly find themselves at the receiving end of cybercrime laws. States should ensure that lawmakers consider the effects of such laws on children, focus on prevention, and create alternatives to a criminal justice response in all but the most serious of cases.
## C. Protection of children in armed conflict, migration and other vulnerable situations (arts. 22 and 38; OPAC)
The digital environment can provide children living in vulnerable situations, including children in armed conflict, internally displaced children, migrant, asylum-seeking and refugee children, unaccompanied children, children in street situations and children affected by natural disasters, with access to life-saving information vital for their protection. The digital environment can also enable them, where necessary, to maintain contact with their families; to access education, health and other basic services; and to obtain food and safe shelter. States should ensure the safe and beneficial access of such children to the digital environment, and ensure their protection from violence, exploitation and abuse.
States should ensure that children are not recruited or used in conflicts, including armed conflicts, through the digital environment. This includes preventing, criminalizing and sanctioning different forms of online solicitation of children, for example through social networks or chat services in online games.
## XIII. International and regional cooperation
The transnational nature of the digital environment necessitates strong international and regional cooperation to ensure that States, businesses and other actors effectively respect, protect and fulfil children’s rights in relation to the digital environment. To this end, States should engage with national and international NGOs, UN agencies, businesses and organisations specialised in human rights in the digital environment.
States should promote and contribute to the international and regional exchange of expertise and good practices, and establish and promote capacity-building, resources, standards, regulations and protections across national borders that enable the realization of children’s rights in the digital environment.
## XIV. Dissemination
States should ensure that the general comment is widely disseminated to all relevant stakeholders, in particular parliaments, governmental authorities, the judiciary, business enterprises, the media, educators, civil society and the public at large, and that it is made available in multiple formats and languages so as to reach, in particular, children, parents and caregivers. States should use digital technologies for widespread dissemination, including to children.
---
1 Quotations are from the children’s consultation in 2019 for the present general comment. Respectively, they are from: a 15-year-old girl, Canada; a 13-year-old boy, Croatia; a 13-year-old boy, Brazil; a group of children from Ghana; a child, Croatia; a 15-year-old girl, UK; and a 15-year-old boy, Portugal.
2 CRC/C/GC/9, paras. 37-38.
3 CRC/C/GC/14, para. 1.
4 CRC/C/GC/24, para. 22; CRC/C/GC/20, paras. 9-11.
5 Children’s consultation.
6 CRC/C/GC/14, paras. 89-91.
7 CRC/C/GC/7, para. 17; CRC/C/GC/20, para. 18.
8 CRC/C/GC/20, para. 20.
9 CRC/C/GC/20, para. 20.
10 CRC/GC/2003/5, para. 45; CRC/C/GC/14, para. 99; CRC/C/GC/16, paras. 78-81.
11 CRC/GC/2003/5, para. 37.
12 CRC/GC/2003/5, paras. 27, 39.
13 CRC/C/GC/19, para. 21.
14 CRC/C/GC/19, para. 27 (b).
15 CRC/GC/2003/5, paras. 48, 50.
16 CRC/GC/2002/2, paras. 2, 7.
17 CRC/GC/2002/2, para. 7.
18 Children’s consultation.
19 CRC/C/GC/16, paras. 62-63.
20 CRC/GC/2003/5, para. 24.
21 CRC/GC/2003/5, para. 24.
22 CRC/C/GC/16, paras. 66-67.
23 CRC/C/GC/16, para. 30.
24 CRC/C/GC/7, para. 35; CRC/C/GC/20, para. 47.
25 CRC/C/GC/17, para. 46; CRC/C/GC/20, paras. 47-48.
26 CRC/C/GC/16, para. 58; CRC/C/GC/7, para. 35.
27 CCPR/C/GC/34, para. 43.
28 CRC/C/GC/16, paras. 19, 59.
29 CRC/C/GC/16, paras. 58, 61.
30 Children’s consultation.
31 CRC/C/GC/17, para. 21; CRC/C/GC/20, paras. 44-45.
32 Children’s consultation.
33 CCPR/C/GC/37, paras. 6, 34.
34 Children’s consultation.
35 CCPR/C/GC/16, para. 10.
36 CCPR/C/GC/16, para. 10; CRC/C/GC/20, para. 46.
37 CRC/C/GC/20, para. 60.
38 CRC/C/GC/24, para. 101.
39 CRC/C/GC/24, para. 100.
40 CRC/C/GC/16, para. 60.
41 Children’s consultation.
42 CRC/C/GC/21, para. 35.
43 Children’s consultation.
44 CRC/C/GC/20, para. 59.
45 CRC/C/GC/20, paras. 47, 59.
46 CRC/C/GC/20, paras. 47-48.
47 CRC/C/GC/15, para. 84.
48 CRC/C/GC/17, para. 13.
49 CRC/C/GC/15, para. 77.
50 Children’s consultation.
51 CRC/C/GC/17, para. 10.
52 CEDAW/C/GC/31-CRC/C/GC/18, para. 64; CRC/C/GC/11, para. 61; CRC/C/GC/21, para. 55.
53 CRC/C/GC/20, para. 47.
54 CRC/C/GC/17, para. 7.
55 Children’s consultation.
56 CRC/C/GC/17, para. 33.
57 CRC/C/GC/17, para. 5.
58 CRC/C/GC/16, para. 37.