The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> ANANYA SINGH: Good afternoon, good evening to all those
who have joined us on-site and online. Welcome to Data Protection for the Next Generation: Putting Children First. I would
like to encourage on-site and remote participants to scan the QR
code which will be available on the screen shortly. Or use the
link in the chat box to express your expectations from this
session. As a reminder I would like to request all the speakers
and the audience who may ask questions during the Q&A round to
please speak clearly and at a reasonable pace. I would also like
to request everyone participating to maintain a respectful and
inclusive environment in the room and in the chat.
For those who wish to ask questions during the Q&A round,
please raise your hand. Once I call upon you, you may use the
standing microphones available in the room. And while you do that,
please state your name and the country you are from before asking
the question.
Additionally, please make sure that you mute all other devices when you are speaking, so as to avoid any audio disruptions. If you are participating online and have questions or comments and would like the moderator to read your question, type it in the chat box. When posting, start and end your sentence with a question mark to indicate that it is a question, or use a full stop to clearly indicate that it is a comment.
Thank you. Let us now begin this session.
Ladies and gentlemen, thank you very much again for joining
today's session. I am Ananya, I will be the on-site moderator for
today's session. Mariam from Gambia will be the online moderator
and Neli from Georgia will be the Rapporteur for this session.
Today we embark on a journey that transcends the boundaries of traditional discourse and into the realm of safeguarding children's lives. In this age of advancements we find ourselves standing at a pivotal juncture where the collection and utilization of children's data has reached unprecedented heights. From the moment their existence becomes evident, their digital footprints begin to form, shaping their online identities even before they can comprehend the implications. Ultrasound images, search engine inquiries: the vast Web of interconnected platforms weaves a tapestry of data capturing every heartbeat and interaction.
Amid this digital tapestry lies a profound challenge: the protection of children's data and right to privacy. Children, due to their tender age and limited understanding, may not fully grasp the potential risks, consequences and safeguards associated with the processing of their personal information. They're often left vulnerable, caught in the crossfire between their innocent exploration of the online world and the complex Web of data collecting institutions. Today we are gathered here to delve deeper into the
discourse on children's online safety, moving beyond the usual topics of cyber bullying and Internet addiction. Our focus will be on answering the following questions. How do we ensure that children in different age groups understand, value and negotiate the self and privacy online? What capabilities or vulnerabilities affect their understanding of their digital data and digital rights? What is a good age verification mechanism, such that the mechanism does not itself end up collecting even more personal data? And finally, how can we involve children as active partners in the development of data governance policies and include their evolving capabilities, lived experiences and perceptions of the digital world to ensure greater intergenerational justice in laws, strategies and programs? We hope this workshop will help the
attendees unlearn the current trend of universal and often adult-centric treatment of all users, which fails to respect children's evolving capacities, lumping them into overly broad categories. Attendees will be introduced to the ongoing debate on digital consent. Panelists will elaborate on children's data selves, and participants will be given a flavor of varying national and international conventions concerning the rights of children regarding their data. As our speakers come from a range of stakeholder groups, they will provide the attendees with a detailed idea of how a multistakeholder, intergenerational, child centered, child rights based approach to data governance policies and regulations can be created. I invite you all to actively engage in this session, to listen to our esteemed panelists, ask questions, contribute insights and share perspectives. I would like to introduce our speakers for today.
To begin with we have Professor Sonia Livingstone, a professor in the Department of Media and Communications at the London School of Economics. She has published about 20 books and advised the UN Committee on the Rights of the Child, the OECD, and UNICEF on children's safety, privacy and rights in the digital environment.
Next we have Edmon Chung, who serves on the boards of DotAsia and ISOC Hong Kong, is engaged with ICANN, founded an international film festival, and participates in Internet Governance matters.
Next we have Njemile Davis, a senior program analyst in the technology division of USAID, working on efforts to make the Internet affordable and to protect children and youth from digital harms. Next we have Emma Day, a human rights
lawyer specializing in human rights and technology and also the
co-founder of Tech Legality. She has been working on human rights
issues for more than 20 years now and has lived for five years in
Africa and six years in Asia. And last but not least, we have
Theodora Skeadas, who is a technology policy expert. She consults
with Civil Society organizations including but not limited to the Carnegie Endowment for International Peace, the National Democratic Institute, the Committee to Protect Journalists, and the Partnership on AI. I would
like to move to the next segment. I invite our speakers to take
the floor and convey their opening remarks to our audience. I
invite professor Sonia Livingstone to please take the floor.
>> SONIA LIVINGSTONE: Thank you very much for that
introduction. It's wonderful to be part of this panel. So I want
to talk about children's right to privacy in the digital
environment and as with other colleagues here I'll take a child
rights focus recognizing holistically the full range of children's
rights in the Convention on the Rights of the Child, and then homing in on Article 16 on the importance of the protection of privacy.
So I was privileged to be part of the drafting group for General comment No. 25, which is how the Committee on the Rights of the Child specifies how the convention applies in relation to all things digital. And I do urge people to read the whole document.
I've highlighted here a few paragraphs on the importance of privacy, and on the importance of understanding and implementing children's privacy, often through data protection and through privacy by design, as part of a recognition of the wider array of children's rights. So respect for privacy must be proportionate and part of the best interests of the child; it must not undermine children's other rights, but ensure their protection.
And I really put these paragraphs up to show that we are
addressing something complex in the off-line world and even more
complex I fear in the digital world. Where data protection
mechanisms are often our main but not only tool to protect
children's privacy in digital contexts.
I'm an academic researcher, a social psychologist. In my own
work I spend a lot of time with children seeking to understand
exactly how they understand their rights, their privacy, and we did
an exercise as part of a -- some research a couple of years ago
that I wanted to introduce the types of privacy and the ways in
which children as well as we could think about privacy.
So as you can see, on the screen, we did a workshop where we
asked children their thoughts on sharing different kinds of
information with different kinds of sources. With different
organizations. What would they share and under what conditions
with their school, with trusted institutions, like the doctor or a
future employer.
What would they share with their online peers and contacts?
What would they share with companies and what did they want to keep
to themselves?
And we used this as an exercise to show that children know
quite a lot, they want to know even more, and they don't think of
their privacy only as a matter of their personal, their
interpersonal privacy. But it is very important to them that the
institutions and the companies also respect their privacy. And if
I can summarize what they said in one sentence, the idea that
companies, which take their data and exploit their privacy, the
children's cry was, it's none of their business. And the irony
that we are dealing with here today is that it is precisely those
companies' business.
We can see some similar kinds of statements from children now
around the world. In the consultation that was conducted to inform
the UN committee on the rights of the child's general comment 25,
and as you can see here, children around the world have plenty to
say about their privacy. And exactly understand it both as a
fundamental right in itself, and also as important for all their
other rights. Privacy mediates safety. Privacy mediates dignity.
Privacy mediates the right to information and so forth.
Many more.
I think we're now in the terrain of looking for regulatory strategies as well as educational ones. I was asked to mention, and I think this panel will discuss, the idea of age appropriate design codes, particularly as one really proving a valuable mechanism.
And we will talk further about this, I know. But the idea that
children's privacy should be respected and protected in a way that
is appropriate to their age, and that understands the link between
privacy and children's other rights. I think this is really
important. And we see this regulatory move now happening in a
number of different international and national contexts.
I spent the last few years working with the 5Rights Foundation, running the Digital Futures Commission. And I just wanted to come back to that holistic point
here. In the Digital Futures Commission we asked children to comment on and discuss all of their rights in digital contexts, not just as a research project, but as a consultation activity, to really understand what children think and want to happen, and to be heard on a matter that affects them. And privacy online is absolutely a matter that affects them.
And we use this to come up with the proposal for child rights
by design, which builds on initiatives for privacy by design, safety by design, and security by design, but goes beyond them to recognize
the holistic nature of children's rights. And so here we really
pulled out 11 principles, based on all the articles of the UN
convention on the rights of the child and on general comment 25.
And so you can see that privacy is a right to be protected in the design of digital products and services, as part of attention to children's rights and an age appropriate service, building on consultation, supporting children's best interests, and promoting their safety, well-being, development, and agency.
And I will stop there. And I look forward to the discussion.
Thank you.
>> ANANYA SINGH: Thank you very much, Professor Livingstone, that was very insightful. We will now move to Edmon. Would you like to take the floor?
>> EDMON CHUNG: Hello. Thank you. Thank you for having me. I'm Edmon from dot Asia. We will be sharing -- I guess building on what Sonia just mentioned, we will be sharing a little bit about our work at dot kids, which is actually also trying to operationalize the Convention on the Rights of the Child.
But first of all, I just want to give a quick background, why
dot Asia is involved in this. Dot Asia ourselves obviously operates the dot Asia top level domain. You can have domains as whatever-dot-Asia. That provides the income source for us, and so every dot Asia domain actually contributes to Internet development in Asia. Some of the things that we do include youth engagement, and we are actually very proud that the NetMission program is the longest standing youth Internet governance engagement program. That sort of built our interest and awareness in supporting children and children's rights online. Back in 2016 we actually launched a little program that looked at the impact of the sustainable development goals and the Internet. And we recently launched an eco Internet initiative.
But I'm not going to talk about that.
What I want to highlight is that engaging children on
platforms including domain, top level domains, is something that I
think is important. And one of the things that I would like to
share.
So on the specific topic of dot kids: the dot kids initiative actually started more than a decade ago, in 2012, when the application for dot kids was put in through ICANN for the dot kids top level domain. Right at that point there was actually an engagement with the children's rights and children's welfare community about the process itself. But I won't go into detail.
What I did want to highlight is that part of the vision of dot kids is actually to engage children to be part of the process in developing policies that affect them, and to involve children's participation and so on. In fact in 2013, while we were going through the ICANN process, we actually helped support the first children's forum focused on the ICANN process itself. And that was held in April of 2013.
Fast forward ten years, we were finally able to put dot kids in place last year. The dot kids top level domain actually entered the Internet on April 4th of 2022, and was launched late last year, on November 29th of 2022. So it is less than a year old. Really not even a toddler, for dot kids.
But let's focus on the difference between dot kids and, for example, dot Asia or dot com. One of the interesting things is that at the ICANN level there is no difference. For ICANN, you know, operating dot kids would be exactly the same as operating dot com.
We disagree, and that's why we engaged in the decade-long campaign to operate dot kids, and believe that there are policies required above and beyond just a regular registry, a regular dot com or dot wherever. Because there's not only a set of expectations; it is important for children -- and here's why we say it's the kids' best interest domain. That is the idea behind dot kids. Let's look at part of the registry policies.
But for dot kids ourselves, if you think about it, of course we don't keep children's data or data about kids. But does that mean we don't have to have policies around the registry or for dot kids domains itself?
Well, we think no. And building off what Professor Livingstone was saying, in fact we have a set of guiding principles that was developed with the support of the children's rights and welfare community and based on the Convention on the Rights of the Child. And of course there are additional kids-friendly guidelines, kids anti-abuse policies and personal protection policies. I highlight that the entire set of guiding principles is based on the Convention on the Rights of the Child -- probably not all the articles, but certainly the articles that outline protection and prohibited materials. The way to think about it is that for all dot kids domains, we do enforce restrictions on restricted content. And the best way to think about it is really that, if you think of a movie, restricted content or rated R movies would obviously not be acceptable on dot kids domains.
But on top of that we also have specific privacy provisions, also built on Article 16 as Sonia mentioned earlier, and some other aspects around the Convention on the Rights of the Child.
So we think there's something important there that is being built into it. And we're probably -- definitely -- the first registry that has built policies around the Convention on the Rights of the Child. But we're also one of the very few domain registries that would actually actively engage in suspension of domains or processes to deal with restricted content.
Beyond that there's a portal and a platform to report abuse
and to alert us on issues and in fact I can report that we have
already taken action on abusive content and restricted content and
so on.
But I would like to end with a few items. There are certainly a lot of abuses on the Internet, but the abuses that are appropriate for top level domain registries to address are a subset of that. There are many other abuses that happen on the Internet, and you know, there are different types of DNS abuses and different types of cyber abuses that may or may not be effective for the registry to take care of.
And that's, I guess, part of what we discuss. That's why we bring it to the IGF and these types of forums: because there are other stakeholders that need to help support a safer environment online for children.
So with that: there are a number of acts that have been put in place in recent years, and I think dot kids is a good platform to support the Kids Online Safety Act in the U.S. and the Online Safety Bill in the U.K. We do believe that collaboration is required in terms of security and privacy. And one of the visions, as I mentioned, for dot kids is to engage children in the process, and we hope that we will reach there soon. But you know, it's still in its toddler phase, so it doesn't generate enough funds for us to bring everyone here. But the vision itself is to put the policies and protection in place and also, into the future, be able to support children's participation in this Internet governance discussion that we have.
>> ANANYA SINGH: Thank you. That was very inspiring. Let's now go to Njemile.
>> NJEMILE DAVIS: Thank you, thank you for giving me the opportunity to speak about the work in this area.
So USAID is an independent agency of the United States government, where I work with 9,000 colleagues in 100 countries around the world to provide humanitarian relief and fund international development.
In the technology division where I sit, there are a number of
initiatives that we support related to digital innovation from
securing last mile connectivity to catalyzing national models of
citizen-facing digital government. And we work in close
collaboration with our U.S. Government colleagues in Washington to
inform and provide technical assistance, to support locally led
partnerships, and to create the project ideas and infrastructure
needed to sustain the responsible use of digital tools.
Although we rely consistently on researching, developing and
sharing best practices, our activity design can be as varied as the specific country and community contexts in which we are called to
action. Indeed the many interconnected challenges that come with supporting the development of digital societies have challenged our own evolution as an agency.
So in early 2020 we launched USAID's first digital strategy to
articulate our internal commitment to technological innovation, as
well as for the support of open, inclusive and secure digital
ecosystems in the countries we serve. Through the responsible use
of digital technology. So that digital strategy is a five-year
plan that is implemented through a number of initiatives and there
are some that are particularly relevant to our work with young
people. Specifically we have made commitments to improve digital
literacy, to promote data literacy through better awareness,
advocacy and training for data privacy, protection, and national
strategies for data governance. To improve cyber security, to
close the gender digital divide and address the disproportionate
harm women and girls face online. And to protect children and
youth from digital harm.
Each of these initiatives is supported by a team of dedicated professionals that allows us to work at the intersection of
children and technology. Digital tools play an increasingly
important role for adults working to protect children, for example
by facilitating birth registration, providing rapid family tracing,
supporting case management, and by using better, faster analysis of
the data collected to inform the effectiveness of these services.
And they can also play a role in the development and integration of children themselves into larger social and cultural norms by
providing a place to learn, play, share, explore, and test new
ideas. Indeed many children are learning how to use a digital
device before they even learn how to walk.
However, we also know that increased digital access also means increased risk. And so, in the context of protecting children and youth from digital harm, USAID defines digital harm as any activity or behavior that takes place in the digital ecosystem and causes pain, trauma, damage, exploitation or abuse, directly or indirectly, in either the digital or physical world, whether financial, physical, emotional, psychological, or sexual.
For the estimated one in three Internet users who are children, these include risks that have migrated onto or off of digital platforms that enable bullying, harassment, technology facilitated gender based violence, hate speech, sexual abuse and exploitation, recruitment into trafficking and radicalization into violence. Because digital platforms share copious amounts of data, our colleagues who have done an incredible amount of highly commendable work at UNICEF, for example, around children's data, as well as my colleagues on today's panel, will likely agree that there are other, perhaps less obvious, risks. For example, we have observed in recent years that children seem to have given uninformed consent to the collection of their data, probably due to their naivete and trust of the platforms with which they're engaging.
But a lack of informed decision-making about data privacy and
protection effectively transfers power from the data subject to the
data collector and the consequences of this can be long-lasting.
The number of social media likes, views and shares is based on highly interactive levels of data sharing, affecting children's emotional and mental health. Data algorithms can be leveraged to
profile children's behavior, narrowing exposure to new ideas,
limiting perspective and even stunting critical thinking skills.
Data leaks and privacy breaches, which are not just harmful on their own but can be orchestrated to cause intentional damage, are another risk. We can counteract these and other challenges by helping practitioners understand the risks to children's data and by ensuring accountability for bad actors.
The theoretical physicist Albert Einstein is famously quoted as saying that if he had one hour to solve a problem he would spend 55 minutes thinking about the problem and only five minutes on the solution.
And the sheer amount of data that we generate and have access to means that our vision of solving the global challenges we face with data is still very much possible, especially as we are realizing unprecedented speeds of data processing that are fuelling innovations in generative AI, enabling the use of 5G, and that we will see in quantum computing. As we celebrate the 50th birthday of the Internet at this year's IGF, it's amazing to think how much all of us here have been changed by the technological innovations paved by the Internet. In that same spirit of innovation, we're optimistic that we can help mitigate the risks we see today and leverage technology to create a more inclusive and more exciting world of tomorrow, which is the Internet our children want.
>> ANANYA SINGH: Thank you very much, Njemile. And Man Hei, would you like to go next?
Emma, are you here with us?
>> EMMA DAY: Thank you, yes. Can you see my screen?
Yes, please go ahead.
>> EMMA DAY: Thank you. I've been asked to answer how Civil
Society organizations can tackle the topic of child centered data
protection. I think this is a multistakeholder issue and there are
many things that Civil Society organizations can do. As a lawyer
I'm going to focus on the more law-focused ideas. There are three main approaches that I have identified. The first is that Civil Society organizations can engage in advocacy related to law and policy. Second, they can engage themselves in litigation -- and requests to regulators, I should say. And third, they can carry out community-based human rights impact assessments themselves. So the
first example is advocacy related to law and policy, where the target is policy-makers and regulators. As an example of this, I was involved in a project that was led by Professor Sonia Livingstone, also on this panel. This was part of the U.K. Digital Futures Commission. It was a project which involved a team of social scientists and lawyers, and we looked in detail at how the use of Ed tech in schools is governed in the U.K.
We found it's not very clear whether the use of Ed tech in
schools was covered by the U.K. age appropriate design code or
children's code. The situation of data protection for children in
the education context was very uncertain. We had a couple of
meetings with the ICO. And the Digital Futures Commission had a group of high level commissioners they had got together from government, civil society, the education sector and the private sector. And they held two public meetings about the use of Ed tech in U.K. schools. Subsequently, in May 2023, the ICO
published updated guidance on how the children's code applies to
the use of Ed tech in U.K. schools. I won't go into the details of the guidance now, but suffice to say this was much needed clarification, and it seemed to be a result of our advocacy, though this was not specifically stated.
The second example is Civil Society organizations engaging themselves in litigation and requests to regulators. Civil Society organizations may have lawyers as part of their staff, or they can work with lawyers and other experts. An example of this is an organization in the U.S. called Fairplay. In 2018 they led a coalition asking the Federal Trade Commission to investigate Google and YouTube for violating the Children's Online Privacy Protection Act by collecting personal information from children on the platform without parental consent. As a result of their complaint, Google and YouTube were required to pay a record 170 million dollar fine in a 2019 settlement with the Federal Trade Commission.
So in response, rather than getting the required parental permission before collecting personal information from children on YouTube, Google claimed instead it would limit data collection and eliminate personalized advertising on their "made for kids" platform. So perhaps predictably, Fairplay wanted to check if YouTube had really eliminated personalized advertising on "made for kids" videos, and they ran their own test by buying some personalized ads. Fairplay says that their test proves that ads on "made for kids" videos are in fact still personalized and not contextual, which is not supposed to be possible under COPPA. In fact, they wrote to the Federal Trade Commission in August 2023, made a complaint, and asked them to investigate and to impose a fine of upwards of tens of billions of dollars. We don't know the outcome of this yet; that complaint was only put in in August of this year.
And then the third solution, which I think is a really good one for Civil Society organizations, and which I haven't really seen done completely in practice yet, is to carry out community-based human rights impact assessments. Companies themselves carry out human rights impact assessments, but it's something that can be done at community level. And this involves considering not just data protection but also children's broader human rights as well -- a multidisciplinary effort. It involves consulting with a company about the impact of their products and services on children's rights, perhaps working with technical experts to test what's actually happening with children's data throughout some platforms, and working with legal experts to assess whether this complies with laws and regulations. Crucially, this should involve meaningful consultation with children. I think we're going to talk later about what meaningful consultation with children really looks like.
I'm going to leave it there because I think I'm probably at
the end of my time. But looking forward to discussing further.
Thank you.
>> ANANYA SINGH: Thank you very much Emma. And finally,
Theodora, would you like to let us know what your remarks are?
>> THEODORA SKEADAS: Hi everybody it's great to be here with
you. Let me just pull up my slides.
Alrighty.
Okay. Hold on one second.
Let me just grab -- okay.
Great. So...alrighty. So it's great to be here with all of
you. And I'll be spending a few minutes talking about key international children's rights principles, standards and conventions, as well as major issue areas around personal data collection, processing and profiling, and then some regulation and legislation to be keeping an eye out for. So I'll start with standards and conventions and then turn to principles. Some of the major relevant standards and conventions that are worth discussing are listed here, which include the UN Convention on the Rights of the Child, a widely ratified international human rights treaty which enshrines the rights of all children under age 18. It includes a number of provisions relevant to children's data protection, such as the right to privacy, the right to the best interests of the child, and the right to freedom of expression.
Also the UN guidelines for the rights of the child as they relate to the digital environment, from 2021. These guidelines provide guidance around how to apply the UN CRC, or Convention on the Rights of the Child, to children's rights in the digital environment, and include a number of provisions relevant to children's data protection, like the right to privacy and confidentiality, the right to be informed about the collection and use of data, and the right to have data erased. The General Data Protection Regulation is a comprehensive data protection law that applies to all organizations that process data for those in Europe, although sometimes this has been extended beyond, specifically for companies or employers that are international and exist beyond the European area. It includes provisions for children as well. And the Children's Online Privacy Protection Act in the U.S. is a federal law that protects the privacy of children under age 13 and requires parental consent before using children's personal information. Some of
the principles that are important to discuss here include data collection, data use, data storage and security, data access and erasure, and transparency and accountability. Data collection means that organizations should only collect data for legitimate purposes and with the consent of parents and guardians. Data use: organizations should use children's data in a way that is consistent with their best interests. Data storage and security: appropriate security measures to protect children. Data access and erasure: organizations should give children and their parents or guardians access to data and the right to have it erased. Transparency and accountability: organizations should be transparent about what they're doing to make sure they're protecting children. Then there are age appropriate design, privacy by default, data minimization and data controls. Age appropriate design: products and services should be developed with the best interests of children in mind and for their age and developmental stage. Privacy by default: products and services should be developed with privacy in mind.
And data minimization: products and services should only
collect and use the minimum amount of data required. And parental
controls: provide parents with meaningful control over their
children's online activities.
So major issues around personal data collection, processing
and profiling that are in discussion today include consent: children
may not fully understand what it means to consent to the
collection and use of their personal data. That's also true for
adults, but especially true for children.
Transparency: organizations may not be transparent about how
they collect, use and share children's personal data, which can
make it difficult for parents to make informed decisions
about their children.
Data minimization: organizations often collect more
personal data than is necessary for the specific purpose, and this
excess data can serve other purposes like targeted ads and
profiling. Data security: organizations may not be
implementing adequate security measures to protect children's data
from unauthorized access and destruction, which can put children at
risk. And profiling: organizations may use children's data to create
profiles which are used to target children with advertising and content
that might not be in their best interests.
Additionally, strengthening legal protection: there's an
ongoing conversation around how governments can strengthen legal
protections for children, such as requiring parental consent and
prohibiting organizations from profiling children through targeted
advertising. Raising awareness: there is a huge conversation
ongoing now about how parents and children should be educated about
the risks and benefits of sharing information online. Also improved
transparency and accountability: organizations should be transparent
about how they collect, use and share children's personal data and
be accountable for that data. Last, designing products and
services that collect and use less personal data from children,
and that also help children and parents manage their privacy
online.
So next we'll look at regulation and legislation. We've been
seeing a huge amount of regulation and legislation in this space. In
the U.S. context we've seen several federal bills, but
because those haven't passed we've been seeing a transition to
state-level bills. So I want to pull up -- there we go. So this
is a piece that I wanted to share that talks about bills in the
area that we've been seeing in the U.S. Here is a compilation
of 147 bills across the U.S. states. Not all are represented but a
lot of them are. Interestingly, they cut across the political divide.
And you can see here the legislation that's in discussion includes
themes like age verification, instruction, parental consent, data
privacy, technologies, and access issues. More age verification;
that's clearly a recurring theme. Recommendations on the basis of
data, et cetera. And you can see here there are some categories.
We see law enforcement, parental consent, age verification,
privacy, school-based, hardware, filtering, and perpetrator, which
looks at safety. Algorithmic regulation and more. And then we can
see the methods. These include third parties, state digital
I.D.s, commercial providers, government I.D.s, and self
attestation. And you can see what ages these are targeting: mostly
age 18, a few look at 13 and sometimes other ages as well.
And then the final categories of analysis look at access to
limited content or services, enforcement types, and status. And I
think that is it. Thank you so much.
>> ANANYA SINGH: Thank you very much, Theodora. I have
received a request from the audience. If you could kindly share
the link to the website that you were sharing with us, that would
be great.
It was a very, very good remark. Thank you very much.
Okay. So we will now be moving onto the next segment where I
will be directing questions to each of our speakers. We will begin
with Professor Sonia Livingstone. While I had a set of questions
prepared for you, Professor Livingstone, I think you kind of
answered most of those. Let's pick something from what you have
focussed on in your opening remarks. You mentioned the age
appropriate design code. I want to know what are your views on
this age appropriate design code for different countries since in
different cultural, national, international and local contexts,
what is appropriate for what age differs. So what would you like
to say about that and how can an age appropriate design code be the
answer in such varying contexts?
>> SONIA LIVINGSTONE: That's a great question and I think
others will want to pitch in. I think my starting point is to say
that if we're going to respect the rights of children online, we
have to know which user is a child. And the history of the
Internet so far is a failed attempt to respect children's rights
without knowing which user is a child. So at the moment we either
have no idea who a user is or we somehow assume or produce a
product, producers somehow assume that the user is an adult, often
in the Global North, often male, and rather competent to deal with
what they find.
So we need a mechanism. And age appropriate design code has
become exactly this mechanism. And I think the extent to which
it's being taken up in the Global North and the Global South shows
the genuine need to identify who is a child.
There are two problems. One, you didn't highlight, but it
does mean that we need to in some way identify the age of every
user in order to know which ones are children. Because we don't
know obviously who is a child beforehand. There is a real set of
questions around the mechanism, which others have alluded to.
And then, as you rightly say, what is appropriate for children of
different ages varies in different cultures. I think I would
answer that by returning to the UN convention on the rights of the
child. It addresses the child rights at the level of the
universal. The right to privacy, the right to safety, the right to
information, the right to civil rights and liberties to participate
and be heard and so forth. We can conceive of children's rights at
the universal level. But there were also many provisions in the
convention and also in general comment 25 about how this can be and
should be adjusted and tailored to particular circumstances. Not to
qualify or undermine children's rights, but to use mechanisms that
are appropriate to different cultures. And I think this will
always be contested. And probably should be. But at heart the --
if you read the age appropriate design codes, they focus on the
ways in which data itself is used by companies in order to support
children's rights.
Rather than setting a norm for what children's lives should
look like.
>> ANANYA SINGH: Thank you very much, Professor Livingstone.
That was a fairly detailed and very nuanced answer. Next, Edmon,
since we are on the subject of age, what do you think is a good age
verification mechanism that does not in itself lead to the
collection of more personal data?
>> EDMON CHUNG: That is a very difficult question. I guess a
few principles to start with, first of all, privacy is not about
keeping data secure and confidential. Privacy the first question
is whether the data should be collected and kept in the first
place.
So in terms of privacy, if it is just an age verification, and
whoever verifies it discards or erases or deletes the data
after the verification, there should be no privacy concern. But of
course platforms and providers don't usually do that. And that's
one of the problems. Right?
But the principle itself should be -- just like when you show
your I.D. or whatever, the person takes a look at it, you know, you
go in and that's it. They don't take a picture of it and keep a
record of it. So that's privacy.
To start with. The other thing then we need to probably think
about whether the age verification is to keep children out, or let
children in. Right? I mean that's -- it's a big difference, you
know, in terms of how you would then deal with it.
But especially on whether or not data should be kept or should
be discarded. Now on the actual verification mechanism I think in
fact there is well-developed systems now to do what is called
pseudonymous credentials, the platform or provider doesn't have to
know the exact data but can establish digital credentials with
digital certificates and cryptographic technologies, techniques
such that parents can vouch for the age and complete the
verification without disclosing the child's personal data. I think
these are the mechanisms that are appropriate and more importantly,
I guess I go back to the main thing is that if it is just for age
verification, whatever data that was used should be discarded the
moment the verification is done. And that is the -- that gives you
real privacy.
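The pseudonymous-credential flow Edmon describes can be sketched in code. This is a hypothetical illustration, not any deployed system: the function names and token format are invented, and a shared HMAC key stands in for the asymmetric digital certificates he mentions, purely to keep the sketch short.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: a trusted verifier checks a child's age once,
# immediately discards the raw ID data, and issues a signed token that
# asserts only "over_13". The platform verifies the token without ever
# seeing the child's identity. A real deployment would use asymmetric
# signatures so platforms cannot mint tokens; the shared HMAC key here
# is a simplifying assumption.
VERIFIER_KEY = secrets.token_bytes(32)  # held by the trusted verifier

def issue_credential(raw_id_document: bytes, verified_age: int):
    """Return a pseudonymous 'over 13' token, or None; never store the ID."""
    del raw_id_document  # deliberately discarded once verification is done
    if verified_age < 13:
        return None
    nonce = secrets.token_hex(16)  # random handle, carries no personal data
    payload = f"over_13:{nonce}".encode()
    tag = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest().encode()
    return payload + b":" + tag

def platform_accepts(token: bytes) -> bool:
    """The platform checks only the signature, learning nothing else."""
    payload, _, tag = token.rpartition(b":")
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(expected, tag)

token = issue_credential(b"<scanned ID, discarded>", verified_age=15)
assert token is not None and platform_accepts(token)
assert issue_credential(b"<scanned ID, discarded>", verified_age=10) is None
```

The key property is the one Edmon names: after `issue_credential` returns, no raw identity data exists anywhere; only the yes/no assertion survives.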
>> ANANYA SINGH: Thank you very much. That was very
comprehensive. Next, Njemile Davis, how does data governance relate
to children's data?
>> NJEMILE DAVIS: We spend a lot of time talking about data
governance. That's because data really fuels technology that
others use that generates data in some way or uses data for its
purpose.
And technologies have a tendency to reinforce existing
conditions. And so we want to be really intentional about how data
is used to that end. Data governance is important for a few
basic reasons. One is because data by itself is not
intelligent; it's not going to govern itself. And because data
multiplies when you divide it, we know the sheer amount of data
that we're generating needs to be wrangled in some manner if we're
going to have some control over the tools we're using. So a data
governance framework helps us to think about what needs to be
achieved with the data, who will make decisions about how the
data is treated, and how governance of the data will be implemented.
Writ large we look at five levels of data governance
implementation, and that's everything from transnational data
governance down to individual protections and empowerment. That's
really the sweet spot for us in thinking about children. It's
about developing awareness and agency, about participation in
data ecosystems. In the middle is thinking about sectoral data
governance, where we find that there are highly developed norms
around data privacy, data decision making, and data standardization
that help structure data-driven tools like digital portals for
sharing data. And so we are currently working with Mozilla
foundation on a project similar to the one that we heard Emma
talking about where we are working in India and India's public
education sector to think about data governance interventions
there. India has one of the largest if not the largest school
systems for children in the world. 150 million children are
enrolled in about 1.5 million schools across the country.
India had one of the longest periods of shutdown during
COVID-19, and Ed tech stepped into that gap very altruistically
to try to close gaps in student education. However, as again Emma
has pointed out, and as we have found in our own research, there were
some nuances in the ways that these Ed tech institutions were
thinking about student learning compared to the way schools
were.
And so private industries incentivized by number of users, and
not necessarily learning outcomes. There needed to be some clarity
around the types of standards that Ed tech companies are to meet.
There's a reliance on Ed tech replacing teachers, interaction
with students. And data subjects generally lacking awareness about
how their data is used by Ed tech and schools to measure student
progress and learning.
So we're currently working with a number of working groups in
India to really understand how to bridge this gap.
And to synchronize the collection of data and data analysis
that harmonizes analog tools with digital tools. So teachers who
are taking attendance, how does that correlate to scores on Ed tech
platforms? So we're focussed right now on the education sector but
we imagine this is going to have implications for other sectors as
well. We're working -- I don't know if I mentioned this -- with the
Mozilla Foundation, in partnership with Mozilla, to look
at responsible and ethical computer science for University students
in India and Kenya and here we're hoping to educate the next
generation of software developers to think more ethically about the
social impacts of the technology they create including generative
AI. And then going back to the protecting children and youth
from digital harm that we're doing, we are extremely proud to be
working alongside and supporting youth advocates through our
digital youth council. We have Ananya, who participated in cohort
1, and Mariam, who I believe was in the room earlier to
moderate the chat session, who are extraordinary examples of the type
of talent we have been able to attract and learn from. In year two
of the cohort we received almost 2,700 applications worldwide. From that
number we selected 12 council members and we're anticipating just
as fabulous results from them.
And so that's generally how we are thinking about children's
data through our data governance frameworks.
I think just kind of riffing off of what I have heard today,
we can also advocate through data governance for inclusion and
enforcement of the rights of children in international data
privacy laws. Especially as we know at IGF, lots of countries are
thinking about how to develop those privacy laws, and we should be
advocating for the rights of children to be included. In Civil
Society there are opportunities to explore alternative approaches to
data governance.
Data cooperatives, which are community-driven, can help groups
think about how to leverage their data for their own benefit.
Civil Society perhaps has room to explore the concept of data
intermediaries, where they are a trusted third party that works
on behalf of vulnerable groups like children to negotiate when
their data is accessed, and to also enforce sanctions when data
is not used in the way that it was intended.
>> ANANYA SINGH: Thank you so much. Since Njemile has touched on
Civil Society, why don't we move to Emma Day and ask her the next
question. Emma, how do you think Civil Society organizations could
work with children to promote better data protection for children?
>> EMMA DAY: Thanks so much for the question. I think Njemile
came up with good starting points for this conversation already. I
think to involve children it has to really be meaningful. One of
the difficulties, not just with children but with consulting with
communities in general on these topics of data governance, is that it's
very complex and hard for people to understand immediately the
implications of data processing for that range of rights,
particularly projecting into the future and what those future
impacts might be. I think to begin with, to make that
consultation meaningful you have to do a certain amount of
education. I think some of the great ways to do this are to involve
children in things like data subject access requests, where they
can be involved in the process of writing to a company and
requesting the data that that company is keeping on them so they
can see in practice what's happening with their data and form a
view on what they think about that.
And for children to be involved in these kinds of community
data auditing processes or -- so there is some auditing of AI,
community-based processes that have been going on which I don't
think have involved children so far. Obviously older children
could get involved in these initiatives. Involving children in
conceptualizing how data intermediaries can work best for children
of different ages is really important. This is something we talked
about a couple of years ago now; I was one of the authors of the
UNICEF manifesto on data governance for children, about what Civil
Society organizations can do to involve children. I haven't seen a lot of
this happen in practice. Another one of the key things that I
would like to see is the Civil Society organizations to involve
children in holding companies accountable. By auditing their
products, by doing these kinds of community-based human rights
impact assessments. And I think we need to think about not just
the platforms and the apps, but also some of the things like age
verification tools, like Ed tech products, like health tech
products, tools that are used in the criminal justice system, that
are used in the social welfare system. Really, technology
products impact almost all areas of children's lives, with
private sector companies providing solutions that are
essentially meant to promote children's rights. We need to ensure that
children involved in auditing those products are making sure that
they really are having a benefit for children's rights.
I think to do that, Civil Society organizations need to ensure
that they involve academics, technologists and legal
experts to make sure that they really get it right. Because these
are complex assessments to make.
>> ANANYA SINGH: Thank you very much. Let's move to
Theodora. I know you mentioned a lot about the existing
international standards and children's rights and their data. What
about the regulations and legislations which are underway to
address some of these concerns? Are there any particular areas
where these regulations could do better or any other suggestions
that you might have for any such future conventions?
>> THEODORA SKEADAS: Hi everyone. A really great question.
Thanks, Ananya. I'm going to screen share again so folks can see
the database I was referencing earlier. I think to me it's not so
much that there are specific technical gaps in what we're seeing
but rather -- and of course this is a U.S. focussed conversation.
It's important to mention that there is legislation being discussed
globally outside of the U.S. as well.
And that legislation that's happening elsewhere is inclusive
of children's safety issues. So for example in the European Union,
transparency related measures like the digital services act and
digital markets act will have impact the on child safety and the
U.K. online safety bill which is underway will also impact child
safety and legislation discussions are happening elsewhere as well.
Within the U.S. where this data set was collected and where my
knowledge is strongest I think it is pretty comprehensive.
Although it's interesting to note that one of the questions that I
saw in the chat touched on a theme that wasn't discussed here in
this legislation. So specifically the question was whether there
was -- I'm just looking through the chat again -- here we go. Oh,
yeah, laws or legislation related to assistive Ed tech in schools.
I observed there are four school-based policies and two hardware-based
policies, but none of them are focussed on assistive Ed tech. The ones
that are focussed on schools look more at access, age verification,
policies and education. And the hardware ones are focussed more on
filtering and technical access.
So you can see those here: requiring tablet and smartphone
manufacturers to have filters enabled on activation and only
bypassed or switched off with a password.
So you can see that there is a -- quite a range. I think to
me the bigger concern is whether this legislation will pass. We
see a really divided political landscape, and even though we're
seeing a proliferation of data and data-related issues around
children in legislative spaces, the concern is that there isn't
going to be a legislative majority for this legislation to pass.
So it's not per se that I see specific gaps, and more that I
have broader concerns about the viability of legislation and the
quality of the legislation. Because not all of it is equally as
high quality. And so I think the increasingly fraught political
landscape that we find ourselves in is making it harder to pass
good legislation. And there are competing interests at play as
well.
Thank you.
>> ANANYA SINGH: Thank you very much. I would now like to
thank all our speakers for sharing their insights with our
attendees and at the same time I would like to thank our attendees
who I see are having a lively chat in Zoom. Hence since you have
so many questions, why don't we open the floor for questions from
the audience.
We would be taking questions from both on-site and online
audience. If you're on-site and if you have a question, there are
two standing mics right there. Kindly go to the
microphones and ask your question by stating your name and the
country you're from. And post that we will be taking questions
from the chat.
>> AUDIENCE MEMBER: May I start? My name is Jutta Croll,
from the Digital Opportunities Foundation, where I am heading a project on
children's rights in the digital environment. First of all, let me
state that I couldn't agree more with what Sonia said in her last
statement that if you don't know the age of all users, age
verification wouldn't make sense. We need to know whether people
are over a certain age, belong to a certain age group, or under a
certain age.
And my question would be we need to adhere to the principle of
data minimization. So whether any of you has already a thought how
we can achieve that without creating a huge amount of additional
data and even the digital services act doesn't allow to collect
additional data just to verify the age of a user.
So it's quite a difficult task. And one has already -- Edmon
has already said if we could trust companies when they do the age
verification that they delete afterwards the data, but I'm not sure
whether we can do so. So that would be my question.
And the second point would also go to the last speaker,
Theodora, that when you give us a good overview on the legislation,
the question would be how could we ensure that legislation that is
underway takes into account from the beginning the rights of
children? Not like it was done in the GDPR in the last minute,
putting a reference to children's rights into the legislation.
Thank you for listening.
>> ANANYA SINGH: Thank you very much. Why don't we lead with
the first half of the question and would any of the speakers like
to take that? And we would then direct the second question to
Theodora.
Yes. Please go ahead.
>> EDMON CHUNG: I'm happy to add to what I already said. In
terms of those cases, it's pseudonymized data. Instead of collecting
the actual data, it is very possible for systems like
platforms to implement pseudonymized credential systems. Those
vouching for a participant's age could be distributed, right? I
mean it could be schools, could be parents, could be your
workplace or whatever. But as long as it is a trusted
data holder that does the verification, and then keeps a
pseudonymized credential, the platform should trust that
pseudonymized credential. I think that is the right way to go
about it. The other part -- as much as I still think it is
the right way to ask for the data to be deleted, can we trust companies?
Probably not. But of course we can have regulation and
audits and those kinds of things.
But the trusted anchors themselves also, whether it's the school
or whatever trust anchor that the person actually
gives their age verification to, that institution should also delete
the raw data and just keep the verification record, you know,
verified or not verified. And that's the right way to do privacy
in my mind.
>> ANANYA SINGH: Thank you. Professor Livingstone wants to
add something. Go ahead.
>> SONIA LIVINGSTONE: Edmon said what I wanted to say. I
completely agree. I've been part of the European project
euCONSENT, trying to find a trusted third party intermediary that
would do the age check, hold the token, and not have it held by
the companies.
So I think ways are being found. Clearly the context
of transparency and accountability and third-party
oversight scrutinizing those solutions will really need
to be strong. And they must also be trusted.
I'd add I think we should start this process with a risk
assessment because not all sites need age checks, not all content
is age appropriate for children. One would like -- I would
advocate that we begin with the most risky content and with risk
assessment so we don't just roll out age verification excessively.
I'll end by noting big tech already age-assesses us in various
ways. I think the big companies already know the age of their
users to a greater degree of accuracy, and we have no
oversight and transparency over that. I think the efforts being
made are trying to right what is already happening, and already
happening poorly from the point of view of public oversight and
children's rights.
>> ANANYA SINGH: Thank you. Emma?
>> EMMA DAY: Yeah, thank you. I think this is still a
question everyone's grappling with, really. And there are differing views
in different jurisdictions around how well age verification
products comply with privacy laws in different countries. I would
really agree with what Sonia said about starting with a risk
assessment. I think we need to look first at what is the problem
we're trying to solve and then whether age verification is the best
solution. Because to start with, if we're going to process
children's data it should be necessary and proportionate. We
have to look at what other solutions there are that are not
technical first, that might address the problem we're trying to
address, rather than looking at just age verification across
everything. I think also there's an issue, certainly under
EU laws: pseudonymization, as Edmon is right to say, but pseudonymized
data is still personal data under the GDPR. And it's not that
straightforward within the EU to use anonymized data either; that
has not been settled yet.
>> ANANYA SINGH: Okay. And Theodora, any remarks from
you?
>> THEODORA SKEADAS: I think this is a really great question.
It's not easy to ensure that legislation takes into account the
stated rights of children. I would start with education. I think
frankly from my experience interacting with legislators since I
participate in the advocacy process, I found that most legislators
are just underinformed. And so making sure that they understand
what these rights and principles and standards actually are. What
does it mean for the right to privacy? To be manifest in
legislation or like what are the best interests of the child? What
is the right to freedom of expression?
What do we think about the right to be informed when it comes
to children? I think most legislators just don't really know what
those things mean.
And so educating them, and in particular building coalitions of
civil society actors and multistakeholder actors, can be very
effective in educating and influencing legislators around the rights of
children. And then, as was also mentioned in the chat, I think Omar
just put it in a few minutes ago, I believe including young people
in decision-making processes is not just essential, it's
empowering. I think that's an important part of the process too.
Bringing together legislators so the people who are actually
writing legislation and the children themselves is really
important. So that the legislation process can be child-centric
and really center the voices and experiences of the children we're
trying to serve. Last I think it's important to recognize this
needs to be done in an inclusive way and engaging children from all
different kinds of backgrounds so that all different experiences
are included as legislation is happening. But again I think
education really is at the core here.
Legislators want to hear from us. And are excited when we
raise our hands. Thank you.
>> ANANYA SINGH: Thank you very much. We will now be taking
questions from the online audience. May I request the online
moderator to kindly read out any questions or comments that we may
have received from the online audience?
>> So we have two questions from the online participants and
two comments. Question one is from Omar, who is a 17-year-old. He
asks how initiatives can be integrated into data governance,
ensuring that children have a voice in policies that directly
impact their digital lives. He's a founder and President of OMNA,
focussed on children's mental health and child rights, and he
wants to increase his impact on data governance for children.
Second question is from Paul Roberts from the U.K. and he asks when
it comes to tech companies designing products and services how
common is it for them to include child rights design in their
process and at what stage? Proactive or afterthought for risk
minimization?
Comment one is also from Omar, who said that he's from
Bangladesh and one of the eight nominees for international
children's rights for his advocacy work. He's the founder and
President of project OMNA and the youngest and only child
panelist of every digital impact session, representing children
globally and providing statements on data protection and cyber
security for children. He suggested answers to the guiding questions
that you started the session with. One, he suggests the use of
interactive tools, adapting these tools to different age levels.
Two, to collaborate with tech companies in order to develop age
verification methods that employ user-created avatars or characters,
safeguarding children's data; feedback will be instrumental in refining
this approach. And three, establish child-led digital councils for
direct input into policy decisions. These groups should meet regularly,
ensuring feedback from children and aligning policies with evolving
needs and digital experiences.
The final comment is from Ying Chu, who said maybe the younger
generations know more about privacy protection and how to protect
their data than educators or us. They were born in the Internet
age and they are Internet kids. Many of us are the Internet
immigrant generation, so children's opinions are equally valid.
>> ANANYA SINGH: That's it. Would any of the speakers or
panelists like to take the two questions that our online
moderator just read out?
>> NJEMILE DAVIS: I can start by addressing the first one.
Omar you should apply to the digital youth council. We are
intending to have a third year and would love to see your
application.
One of the things that we tried to do there is to raise the
voice of youth advocates not just to the level of international
development organizations like USAID, but to also empower them
to activate other youth networks. And in those efforts the level
of awareness-raising helps to inspire and incentivize solutions
that we have not thought of yet.
There's this constant tension between adults who have
authority to make decisions and children who understand what's best
for them but perhaps don't have the agency to do such.
And we tried to use this platform as a way to bridge that gap.
>> ANANYA SINGH: Okay. Are there any other comments from the
panelists? And since we are running short on time, I would
otherwise like to move to the next segment. Okay. We see
professor Livingstone has comments. I would request you to kindly
keep it short.
>> SONIA LIVINGSTONE: I'll be brief. I think everyone here
is convinced of the value of youth participation and rightly so. I
think the challenge is for those who haven't yet thought of it or
haven't yet embraced its value. So my answer to Omar and also to
Paul Roberts would be to talk more -- give more emphasis to child
rights impact assessments. I think many companies understand the
importance of impact assessments of all kinds, and the child rights
impact assessment requires youth participation as part of
its process along with gathering evidence and considering the full
range of children's rights. Perhaps it's more a mechanism in the
language of companies. And so one that if child rights are
embedded in the process, perhaps by requirement, I think that would
make many improvements.
>> ANANYA SINGH: Thank you, Professor Livingstone. As we
enter the final minutes of this enlightening session, I'm happy to
invite our speakers to share their invaluable recommendations in
less than a minute if possible. The question for all the panelists
is how can we involve children as active partners in the
development of data protection policies to ensure greater
intergenerational justice in laws, policies, strategies and
programs. Before I give the floor to our speakers I would also
like to strongly encourage the audience to seize this opportunity
and share their recommendations by scanning the QR code which is
displayed on the screen or by accessing the link shared in the chat
box. I would like to welcome Professor Livingstone to share her
recommendation in less than a minute.
>> SONIA LIVINGSTONE: Thank you. I've mentioned child rights
impact assessment. Perhaps that is my really key recommendation.
I think that what we see over and again in child and youth
participation is that children's and young people's views are
articulate, are significant, and are absolutely valid. The
challenge really is also for us who are adults. Every time we are
in a room or a meeting or process where we see no young people are
involved, we must point it out. We must call on those who are
organizing the event, and that includes ourselves sometimes, to
point out the obvious omission and to be ready to do the work, to
say these are the optimal mechanisms and here is a way to start,
because people find it hard. Youth participation is absolutely
critical in this, and it is of course young people's right.
>> ANANYA SINGH: Thank you. Edmon?
>> EDMON CHUNG: I will be very brief. I think a children's
IGF has been called for. That's the beginning of this wider
awareness. And I think it's about building the capacity as well,
right? You can't just throw children into a focus group for two
hours and expect them to come up with a brilliant policy decision,
right? So it's a long-term thing. It starts with the Internet
governance community and all these processes actually having
children as part of a stakeholder group, and that, I think, is
probably a good way to go about it.
>> ANANYA SINGH: And Njemile?
>> NJEMILE DAVIS: Thank you. We need to do a better job
discussing digital and data rights in formal education
institutions. I think we can do a much better job of that
globally. So that there's a welcoming, encouraging environment to
hear children and how they would like to advance their digital
identities. In a digital society they have awareness, they have
tools and they have opportunities to do so, in safe ways with
mentorship and guidance.
>> ANANYA SINGH: Emma?
>> EMMA DAY: I would like to emphasize that children are not
a homogenous group. It's important to center the most vulnerable
and marginalized children within a country or geographically,
considering the global reach that a lot of apps and platforms have
these days. There's a particularly great scholar I would recommend
reading up on, Rigot, and her work Design From the Margins. She
talks about how, if products are designed from the start for the
most marginalized users and risky scenarios, in the end it's going
to benefit many more. I'm going to share that in the chat. Thanks.
>> ANANYA SINGH: Thank you. Finally, Theodora?
>> THEODORA SKEADAS: It's worth mentioning again that we need
to be centering the voices of children as active participants in
conversations about their well-being. This can be done by
including them in surveys, focus groups, workshops, and various
methods that are child-friendly. In the legislative process I
think that children should be empowered to advocate for
themselves, specifically older children, but children from all
different backgrounds. It is their well-being at stake. When it
comes to companies, I would like to see children represented on
advisory boards. That hasn't traditionally happened. And I put a
few examples of advisory boards in the chat because these are ways
to elevate the voices of children directly in conversation with
the people making policies for the platforms. Thank you.
>> ANANYA SINGH: Thank you very much. Ladies and gentlemen,
as we come to the end of this enlightening session I would like to
express my heartfelt gratitude to our distinguished speakers for
their unwavering commitment to sharing their knowledge and
expertise and for making our lives easier as moderators because I
see you have been responding to the comments and questions in the
chat box.
I would also like to extend my deepest appreciation to the
very active audience for their extremely energetic engagement and
thoughtful participation. Without your participation this session
would not have been as meaningful. On the subject of people who
have been instrumental in making this session a success: my
teammates, the talented co-organizers of all the four workshops we
have hosted during the UN IGF 2023. Mariam from Gambia and Neli
from Georgia, I cannot thank you both enough for your exemplary
commitment, hard work, inspiring creativity and tireless efforts,
in the absence of which we would not have been able to create the
impact we have. I want everyone here in attendance to be aware of
and appreciate the accomplishments and personal sacrifices the
team has made to keep this ship afloat. It was my good fortune
indeed to have had the honor of leading this exceptional team. So
thank you once again for making this happen.
As we conclude this session, I urge all of us to kindly
reflect on the insights we have gained and the recommendations put
forth. Let us not let this be just another event or seminar, but
rather a catalyst for action. It is up to each of us to take the
lessons learned today and apply them in our respective fields,
organizations and communities. Together we can create a better
world for ourselves and future generations. And we are right on
time. Good-bye. Thank you.