Session
Roundtable
Duration (minutes): 60
Format description: Because the discussion is organized around a central theme, a roundtable can present the responsibilities and challenges of Internet companies in child online protection comprehensively and from multiple angles within a short period. The roundtable format encourages participants to take an active part and to ask and answer questions; this interaction not only makes the discussion livelier but also prompts deeper reflection. Through in-depth discussion and exchange, participants can more easily reach consensus on the responsibility of Internet companies for child online protection, which in turn helps drive the implementation of relevant policies and measures. A 60-minute roundtable on the responsibility of Internet companies to protect children online therefore offers clear advantages: it fosters exchange and cooperation among participants and promotes in-depth study and resolution of the relevant issues.
The purpose of this forum is to explore how tech companies can effectively fulfill their responsibilities for protecting children online while pursuing technological innovation. To promote responsible technological innovation in the service of children, authorities at various levels in China have introduced multiple laws and regulations on the online protection of minors, such as the Law on Protection of Personal Information, the "Online Protection" chapter of the Law on Protection of Minors, and the Regulations on the Protection of Minors Online. In 2021, UNICEF released version 2.0 of its global Policy Guidance on AI for Children, which sets out principles for responsible artificial intelligence for children. How to encourage tech companies to fulfill their responsibility for child online protection in the course of technological innovation is the main focus of this forum. The forum plans to invite policymakers, industry practitioners, academic researchers, and social workers in the field of child online protection to engage in a dialogue. By integrating the perspectives of industry, academia, research, and society, the forum aims to provide recommendations for responsible technological innovation for children.

The forum will mainly discuss the following three aspects:
(1) How can tech companies ensure that their technological innovations do not have negative impacts on children's online safety and privacy? This involves designing safer and more child-friendly tech products and services, and fully considering the special needs and rights of children throughout the innovation process.
(2) How should tech companies formulate and implement effective child online protection policies and measures? This includes developing clear child online protection policies, establishing dedicated child online protection teams, and ensuring children's safety and health when using Internet products through technical means and manual review.
(3) How can awareness of child online protection be raised among tech companies and the general public? This includes enhancing tech companies' awareness and sense of responsibility through publicity and education activities, and increasing public attention to and participation in child online protection issues so as to jointly create a safer and healthier online environment.

The challenges and opportunities related to the topic of this forum are as follows. On one hand, with the rapid development of technology, child online protection must continuously adapt to new threats and vulnerabilities. This requires tech companies not only to focus on technological innovation but also to strengthen research on and defenses against emerging online threats. Tech companies may also consider adopting international standards and principles such as Responsible Innovation in Technology for Children (RITEC) or Safety by Design when pursuing children's online safety. At present, laws, regulations, and policies related to child online protection need to keep pace with technological development, which requires tech companies to cooperate actively with governments, social organizations, and others to improve relevant regulations and policies while innovating. Improving parents' and children's own awareness of online security is a long-term and challenging task.
Tech companies need to consider user education in product design and conduct public-welfare activities to raise public awareness of online security. On the other hand, technological innovation provides new solutions for child online protection. For example, technologies such as artificial intelligence, applied in line with the principles of the UNICEF Policy Guidance on AI for Children, can identify and prevent the spread of harmful information more effectively, thereby protecting children's online safety. As global attention to child online protection increases, governments and the private sector will invest more resources in the research and application of relevant technologies to ensure safe technology products and safe access to the Internet. This offers tech companies significant market opportunities: actively fulfilling the responsibility of child online protection helps enhance their social responsibility and brand image, which not only strengthens their competitiveness but also attracts more socially responsible users and partners. In summary, tech companies face both challenges and opportunities in fulfilling their responsibility for child online protection through technological innovation. By overcoming challenges in technology, regulation, and user awareness, while seizing opportunities in technological innovation, policy support, and social responsibility, companies can play a greater role in protecting children's online safety while achieving their own sustainable development.

Agenda (60 Min):
1. (2 Min) The moderator introduces the theme and purpose of the forum and introduces the participants.
2. (8 Min) Organizers or child protection experts present the call for cases of responsible technological innovation for children launched by UNICEF and CFIS this year.
3. (20 Min) Onsite speakers share their views. Topics to be addressed include but are not limited to: (1) the balance between technological innovation and children's online protection; (2) formulating and implementing effective child online protection policies and measures; (3) raising awareness of children's online protection among Internet enterprises and the public.
4. (10 Min) Online speakers share their views. Topics to be addressed include but are not limited to: (1) the experiences and measures of well-known Internet companies in different countries on child online protection; (2) policies, regulations, or standards in different countries that promote responsible technological innovation for children.
5. (10 Min) Representative Internet companies share their experiences and practices in responsible technological innovation for children.
6. (8 Min) Open discussion and Q&A: all participants onsite and online will have the opportunity to ask questions and present their viewpoints, and speakers will respond.
7. (2 Min) The moderator delivers the closing remarks.
(1) As a hybrid forum, the session will have both onsite and online speakers and attendees. To attract more attendees, we will invite and encourage people to participate online via Zoom. The onsite and online moderators will cooperate closely, each responsible for facilitating the onsite and online discussions respectively. The online moderator will collect online questions in a timely manner and relay them to the onsite moderator to ensure smooth communication between onsite and online attendees.
(2) Multiple volunteers will be arranged onsite for video filming so that each speaker's presentation can be live-streamed through Zoom. The onsite moderator and speakers can see online participants' questions in real time on the LED screen on site. The online moderator will interact with online attendees, promptly forward meaningful questions via the screen to the onsite moderator or speakers, and strictly manage speaking time to ensure that every speaker can participate and the forum stays on schedule. In addition, we will promote the forum in advance so that online participants can prepare questions and relevant materials beforehand, which may lead to more engaging discussions onsite.
(3) The organizers will design graphic and text posts, promotional posters, and related materials around the theme of the forum and distribute them through social media platforms such as WeChat, Twitter, and Facebook to attract more attendees and stimulate thinking in advance, creating an atmosphere of joint participation.
UNICEF China
Organizer 1: Rui Li, UNICEF China, Intergovernmental Organization
Organizer 2: Shenrui Li, UNICEF China, Intergovernmental Organization
Organizer 3: Xiuyun Ding, China Federation of Internet Societies, Civil Society, Asia-Pacific Group
Organizer 4: Yiran Xing, China Federation of Internet Societies, Civil Society, Asia-Pacific Group
Organizer 5: Ming Yan, Communication University of China, Civil Society, Asia-Pacific Group
Organizer 6: Meiqi Luo, Communication University of China, Civil Society, Asia-Pacific Group (Online)
Organizer 7: Gan Lu, Child Law International Alliance, Civil Society, Eastern European Group
1. Hui Zhao, China Federation of Internet Societies, Civil Society, Asia-Pacific Group
2. Afrooz Kaviani Johnson, Global Lead of Child Online Protection, UNICEF HQ
3. Dora Giusti, UNICEF China, Intergovernmental Organization
4. Tong Lihua, President of Child Law International Alliance
5. Joy Katunge, Kenya Alliance for Advancement of Children, Legal & Policy Advocate on Child Online Protection, Africa Group
6. André F. Gygax, the University of Melbourne, Civil Society (online)
7. Zhang Lei, Tencent, Private Sector, Asia-Pacific Group
8. Zhao Mengxi, Baidu, Private Sector, Asia-Pacific Group
Shenrui Li, UNICEF China, Intergovernmental Organization
Xiuyun Ding, China Federation of Internet Societies, Civil Society, Asia-Pacific Group
Ming Yan, Communication University of China, Civil Society, Asia-Pacific Group
Targets: First, the responsibility of Internet companies to protect children online is directly linked to SDG 16.2 and its goal of promoting just, peaceful, and inclusive societies. Protecting children from online threats and abuse, and ensuring their safety and dignity in cyberspace, is an essential part of achieving a just society. By formulating and implementing effective child online protection policies and measures, Internet companies can promote a more just social order and create a safer and more harmonious online environment for children. (SDG 16.2)

Second, while fulfilling their responsibility to protect children online, Internet companies also need to focus on digital inclusion and ensure that all children have equal access to the Internet and related services. Through innovative technological means, Internet companies can provide more user-friendly and accessible Internet products and services for children, narrow the digital divide, and contribute to the infrastructure and industrial innovation goals of SDG 9.2. (SDG 9.2)

Finally, as an important part of society, Internet enterprises should actively fulfill their social responsibilities, including playing an active role in the online protection of children. Raising the awareness of Internet enterprises and the public on child online protection, and encouraging Internet enterprises to adopt more responsible behaviors, can help realize the responsible consumption and production targets of SDG 12.8. This will not only help protect children's online safety, but also draw society-wide attention to online safety issues and promote the sustainable development of cyberspace. (SDG 12.8)
Report
(1) The use of emerging smart technologies can improve minors' well-being, for example through health monitoring, recommendation of quality content, and companionship for special groups. However, these technologies also pose many risks to minors, such as unfairness, threats to data privacy and security, and online addiction.
(2) Research indicates that parents' digital literacy can influence children's perspectives on online activities. When parents serve as exemplary models in terms of Internet usage and possess the ability to discern online information, children are more likely to perceive the Internet as a tool for learning and personal development.
(3) Children’s rights in relation to the digital environment are indivisible, interdependent, and interrelated. Children have unique vulnerabilities compared to adults. However, we must acknowledge and support children’s agency and resilience and give due weight to children’s views. Children do not constitute a homogeneous group but must be considered in their full diversity.
(1) All businesses that target children, have children as end users, or otherwise affect children have a responsibility to respect children’s rights in the digital environment.
(2) Technological innovation must be balanced with the responsibility to protect children. It is imperative that tech companies pair their pursuit of innovation with a steadfast commitment to child protection. By adopting responsible practices and integrating safety-by-design principles, tech companies can create digital spaces that are not only innovative but also safe and supportive for children.