IGF 2023 – Day 0 – Governing Data: What Can Parliamentarians Do To Support A Trustworthy Online Space? – RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> LILY EDINAM BOTSYOE:  Our session starts soon.  May I request our speakers to join me on stage.  Welcome, everyone. 

    Distinguished attendees. 

    (Speaking non‑English language) online audience.  I'm here in person and excited you get the opportunity to join us today.  We have Parliamentarians with us today who will help us answer the question: what can Parliamentarians do to support a trustworthy online space? 

    My name is Lily Edinam Botsyoe.  I'm excited to be moderating the session.  I will share the biographies of our speakers and launch the conversation.  For those online, there will be times when we open the floor for you to send in your comments and contributions. 

    And in this room we have two microphones available.  So when the floor's open, please go to the microphone and ask your questions.  Our Parliamentarians will respond to you. Again, you're welcome. 

    Speakers today: we have Tobias Bacherle, who is an office director.  He has a longstanding partnership with the African Union, the AUD, and specialized agencies, with the goal of advancing sustainable development of the African continent in line with the AU Agenda 2063. 

    We also have Alison Gilwald, who will be joining us pretty soon.  She's the Executive Director of Research ICT Africa.  We have Mr. ‑‑ hopefully I didn't butcher the name ‑‑ a member of the German Parliament on behalf of the parliamentary group Alliance 90/The Greens.  And we also have Madam Laura Tresca, who is part of the Brazilian Internet Steering Committee.  She's a journalist and a social scientist with expertise in digital rights and government affairs. 

    So our speakers are a good blend of Parliamentarians and practitioners.  We will be kicking off the conversation soon with our questions.  Before that, an opening remark from Mr. Tobias.  The floor is yours.

>> Thank you for the warm words of welcome and good morning from my side.  Honourable members of Parliament, esteemed delegates.  It's a great honor and pleasure to have the opportunity of welcoming you today to the third Parliamentary Track of the Internet Governance Forum.  On behalf of the German development agency, I would also like to seize the opportunity to thank our hosts, the government and people of Japan, for the absolutely wonderful hospitality here in the historic city of Kyoto. 

    I'm very excited to see so many Parliamentarians from across the globe represented here.  I'm grateful to see you engage in inter‑Parliamentary dialogue to advance our co‑operation on key policy issues of the digital sphere. 

    In 2019, when the UN IGF was hosted by the Government of Germany in the city of Berlin, the initiative to establish a Parliamentary roundtable was first formed.  This idea was then continued in 2020. 

    The idea was further advanced in 2021 during the IGF in Katowice, and in 2022 the Parliamentary roundtable turned into an extended Parliamentary Track that encompassed activities leading up to the annual meeting as well as during it. 

    And I believe the fact that we now see this continued for the third consecutive year proves that this initiative is actually bearing fruit.  And as director of the GIZ African Union office, I'm particularly happy to see that there are so many African members of Parliament represented here.  This is also because I believe that legislators play an absolutely crucial role when it comes to shaping a free, open, and secure internet for all. 

    And with this Parliamentary Track, a really important format has been established which highlights the role that Parliamentarians play as custodians of the digital realm. 

    So when we see these discussions taken from the national and regional level to the global stage, I think we will be able to really find key solutions that are beneficial to everyone.  Especially when we look at the content today: the first topic that you will discuss, the governance of data, is currently very much at the core of GIZ's work with the African Union, because we support Member States of the AU towards harmonizing data regulations across the continent. 

    By this means enabling the free flow of data, but also trust between countries, in order to fully leverage the economic and social opportunities of digital transformation. 

    That being said, ladies and gentlemen, I would not like to take much more of your time, because I'm very much looking forward to the discussion of the panel here today.  And I'm eagerly looking forward to coming up with some tangible conclusions during this week, so we can all benefit from the diversity of perspectives that are present here in this room. 

    And I really hope that the outcomes of these discussions will be in a format suitable for you to take home to your National Parliaments and put into action. 

    As GIZ we will continue with the Parliamentary Track, and we are doing this on behalf of Germany.  I wish you great success, inspiration, and some fruitful discussions here today. 

    Thank you so much.

>> LILY EDINAM BOTSYOE:  Thank you so much.  A round of applause for Mr. Tobias.  Legislators play a crucial role in ensuring a safe and open internet for everyone.  This builds on the Parliamentary Track at the IGF, in particular through intersessional activities fostering inter‑Parliamentary dialogue on important issues. 

    This topic is about the role of Parliamentarians in shaping digital trust, as we witness a growing lack of trust around privacy, security, and human rights protection.  So we kickstart the session with our first round of questions. 

    I'm going to start with Madam Laura Tresca, who is part of the Brazilian Internet Steering Committee.  Legislators are key actors in developing the legislation that contributes to a trustworthy digital space.  How can government and industry collect data with principles, standards, and practices that protect data and the right to privacy?  The floor is yours. 

>> LAURA TRESCA:  Thank you so much for the invitation to take part in this panel.  It's really a pleasure to discuss this topic with you. 

    To respond to this question, I want to bring the Brazilian experience with the data protection law.  The process of developing and passing the data protection law in Brazil was marked by various stages of consultation and engagement with stakeholders, following a multistakeholder approach. 

    The process began with online public consultation to gather input from various stakeholders, including government agencies, Civil Society organisations, industry representatives, and the general public. 

    In the first phase there was just one main question: what are the topics and issues that a data protection law should address? 

    The comments were transformed into a draft, and that was submitted to online public consultation again.  After a period of open comment, a new version of the draft was produced and, again, put to public consultation.  So there were three public online consultations. 

    These consultations allowed interested parties to provide feedback on the draft data protection law.  And when the bill was introduced in the Brazilian Congress, there was a debate on this topic. 

    In the Brazilian Congress it was also an innovative process.  The deputy could have followed the usual legislative pathway, but he made a different choice, a different approach. 

    He set up a multistakeholder roundtable and, together with the several stakeholder representatives, he read the bill topic by topic, negotiating and discussing the bill text all together. 

    It took several rounds of debate and scrutiny.  This input was instrumental in shaping the final version of our data protection law. 

    After this pathway, it remained for the Brazilian Congress to approve the law. 

    The participatory process in Brazil's data protection law development emphasized the importance of engaging a wide range of stakeholders, including Civil Society, industry, and government, in shaping the legislation.  Public consultations and the roundtable allowed transparency, accountability, and the incorporation of diverse perspectives into the law.  This participatory approach is in line with the multistakeholder approach and Brazil's commitment to addressing data protection and privacy concerns in an inclusive manner.  Thank you. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  So we see the Brazilian perspective of a multistakeholder approach: asking what a data protection law should entail, opening it up to public consultations, and then incorporating the resulting feedback.  That's an interesting approach. 

    I will ask Honourable Tobias the same thing.  How do you support the building of principles, policies, and standards to protect data and the rights of people? 

>> TOBIAS BACHERLE:  First of all, it's very important to identify what principles we want to push forward.  I think one principle that's very important, if we are talking about data legislation, is to give the power back to the individual ‑‑ have a human‑centric approach.  It sounds very, very basic.  Probably what drives us in our policy making is a very human‑centric approach. 

    If we're talking about data policies, I would not say humans but rather users, because we have different kinds of users.  Sometimes those users are companies.  Sometimes they are actually individual humans.  So let's talk about users and give those users more control.  Because right now many, many people ‑‑ I think it's almost half of the people using the internet ‑‑ believe they don't have any control over the data they're creating online. 

    And why is that a problem?  We're coming into an age of AI, of data‑driven, fact‑based decision making.  That's a good thing.  But it's only a good thing if the data is valid, if there are not too many biases in it, if we have good data quality. 

    And to get that good data quality we need people ‑‑ users ‑‑ to share their data.  And they will only do that if they trust us as governments, and of course companies, with their data. 

    So this is, I think, very important to understand: why do we want people to trust us, whether as governments or companies, and to share their data with us?  Why do we need a data policy framework in the first place? 

    If we go to the next step and decide we do want to earn the trust of the people, of the users, to share their data, we need to give them back the power, the control, but also transparency.  And I think that's very important to understand.  Giving away power is sometimes not the first thing you want to do as a legislator or a state.  But I think it's very, very important. 

    The control over what data is used for which use, in what circumstance, for what approach needs to lie with the user.  And they need to trust legislators, but also the State, to fulfill the promise that the data is only used for the purpose they agreed on. 

    Now, the internet is a very, very wild and broad place.  We tried a very similar approach with cookie use in Europe, and that went, well, not as we planned.  Now in Europe you have to click on every website whether you agree to the use of cookies ‑‑ and you don't really know what you're agreeing to.  If you're a bit more into the topic, you maybe scroll down and decide; but there are so many things you can't decide on, so it becomes let's share nothing or let's share everything.  And probably most people don't really look into what they could share. 

    So what we are missing right now is someone who is sitting in between us and the decision.  Because I can't make the decision for every single data point I leave behind on the internet. 

    So we need to establish a framework that gives us the possibility to make an overall decision about what I want to be able to share my data for and for what purpose, and to have someone to control that ‑‑ and for us, as a State, to control those who are controlling that. 

    If we go that way ‑‑ and I think we have a bit more time to go into depth on that later ‑‑ but give the transparency as well: have a data cockpit where citizens and users are able to see what actually happened with their data.  Who asked for the data?  Who got the data?  And what did they do with it?  Then I think we're on a really good pathway. 

>> LILY EDINAM BOTSYOE:  Absolutely.  One of the key points that caught my attention was the human‑centric approach.  The question of the what and the why should lie with the user.  What should a user know?  The transparency that companies and organisations give, and also the regulation the government provides, so that one can see what organisations are collecting the data for, for what purpose, and at what expense.  The user is able to see all of this and understand. 

    So if you're just joining us, we're talking about governing data and asking what the role of legislators is in ensuring a trustworthy internet space.  We've heard about a multistakeholder approach to asking what should be in a data protection law, with the consultations that took place in Brazil, and from the German perspective the idea that the human ‑‑ the user ‑‑ should be able to know and understand why their data is being collected, and that legislators should help regulate what organisations collect. 

    We're moving to the next part of the conversation.  But I'll open the floor soon enough for everyone in the room to be able to join. 

    I see one of our speakers has come in and joined us on the stage.  She'll also contribute to the conversation.  The second question we want to delve into is: how can governments and industry collect data with principles and standards, with policies and practices that protect data and the right to privacy?  We know what these principles are from Honourable Tobias and from the Brazilian context.  Now we want to see how data is collected to those standards.  I'll start with you again, Madam Laura. 

>> LAURA TRESCA:  Thank you.  I think that an innovative approach that holds promise in achieving this balance is the concept of the regulatory sandbox.  Regulatory sandboxes are controlled environments where data‑driven products can be tested with flexibility while maintaining regulatory compliance.  The sandbox creates a space where governments, industry, and regulators can work collaboratively to foster innovation while ensuring that data protection and the right to privacy remain paramount. 

    I would propose an international regulatory sandbox, because we have international data flows, and sometimes we approve a law in one jurisdiction while the data flows through other jurisdictions. 

    So with this model, the regulatory authorities closely monitor and supervise the projects within the sandbox.  This oversight can ensure that the data collected is used appropriately and in accordance with privacy regulations, and maybe we can avoid the problems that Tobias mentioned before. 

>> LILY EDINAM BOTSYOE:  Absolutely.  That is an interesting one.  I like the idea of this international regulatory sandbox.  Because of the different jurisdictions we're in, we can test something out to see if it works.  It may seem impossible at the beginning ‑‑ how do we test it out? 

    The sandbox gives the opportunity to fail forward, get feedback, and move on.  That is a way of collecting data and aligning it with regulations that are accepted to a large extent across the world.  So that's an interesting perspective. 

    I'm going to come to you also, honourable Tobias. 

>> TOBIAS BACHERLE:  I already mentioned that trust and transparency, I think, are keystones and should be the approach we're going for.  And how can we achieve them?  First of all, I started with transparency.  I already mentioned the data cockpit for the upcoming digitalisation of our public services.  Estonia has established that, and it's very, very helpful.  You log in and you actually see who asked to see your data, who used it; and in certain instances, if they don't have permission to get the data, you get a request asking whether you want to share the data in the first place. 

    This is very, very helpful.  So if I go to a doctor, I see the first doctor.  If I go to a second doctor because I want a second opinion, it's helpful if I see in my data cockpit whether the second doctor was reading the report of the first doctor; and if they give the same words and the same ideas, I realize maybe they just got it from the first doctor.  I don't want to cast doubt on our medical system and doctors ‑‑ I trust them a lot ‑‑ but it's sometimes good to see who looked at the report in the first place.  I think this is a very good part. 

    The other part is being able to see your data history.  We have that in the GDPR.  For example, if you're on Facebook, you can ask them to hand over all the data they've collected from you.  That's a lot ‑‑ you don't really want to see all of it.  But sometimes it's a good idea to see: what did I share?  What is it we're actually talking about when they ask whether I want to allow them to share it with somebody else?  That's on transparency. 

    The second part is data trustees.  If I decide my medical data can be shared with everyone who is working on certain medicines, I want it only to be shared with those who are working on those medicines ‑‑ not with the government.  Because we need to understand: data is the basis of knowledge, and knowledge is power.  Therefore, knowledge about someone can be power over someone. 

    So I need someone in the middle, between me and those who want to use the data, to control whether they actually want to use the data for the purpose I'm willing to let them use it for. 

    In the European Union that hasn't really worked out so far, because we don't have that kind of data market yet.  So what we're trying ‑‑ or what I'm hopeful might happen now, after we decided on the Data Act, which is mostly about data generated by machines and in production ‑‑ is that we get a market where data trustees have a bigger market, more data to actually control, and therefore to provide: not necessarily selling to the highest bidder, but rather to those who are allowed to use the data, because the user has said that in this case you're allowed to use it. 

    So that's on the data market sphere.  Now, if we are talking about collecting data as a State ‑‑ and I come back to the first part ‑‑ it's very important that we are really only using it for what we promised we're going to use it for. 

    And one idea I'm very fond of is to not have a single identifier.  We have several identifiers right now, but in the digital age we're pushing towards a single identifier system.  So we would have one identifier which is used by the State to identify me in every case, which sometimes makes sense.  But it also makes it very easy to connect certain data points that are maybe stored with different entities. 

    So, to think that idea through: have separate registers ‑‑ for example, medical files with one identifier, while my communication with the central state has another identifier.  And we can solve that technically without giving citizens seven different apps with seven different passwords to log in with: just use one app that works basically as a key pass.  But with a single identifier, someone who decides to look up what a certain medical file communicated with the state, or with certain banks, can easily connect those data points. 

    This is a rather technical approach, to provide a real safety net against misuse by the State, by hackers, or by other third parties. 

    Last but not least, the open data approach: I believe in opening the data we hold as a State.  We can't go around preaching that everyone should be sharing their data while we sit on a lot of data and don't share it with Civil Society or with companies who might put it to use.  I think that's very, very important, because preaching something and not doing it ourselves is always a very bad idea. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  So we saw that data is a basis for knowledge and knowledge is power, and what regulation can mean for an everyday user: the transparency, the data trustees, the issue with single identifiers, and open data. 

    If you just joined us, we are talking about governing data and what role legislators have, or can play, to ensure there is a trustworthy online space.  We've heard perspectives from Honourable Tobias from Germany and Madam Laura from Brazil. 

    Now I want to hear the perspective of Ms. Alison Gilwald, who is the Executive Director of Research ICT Africa.  I want to ask you the question of how governments and industry can collect data with principles and standards for policies, and how they can protect the data and the right to privacy of users. 

>> ALISON GILWALD:  Thanks very much, Lily.  Apologies for my late arrival.  I certainly want to share with you some of the context for that kind of government and industry role in collecting and protecting data.  I want to do it within the context of the recently passed African Union Data Policy Framework, because I think this provides a very enabling environment for more positive use of data. 

    And I mention that because the Data Policy Framework is a very substantive, large‑scope governance framework.  It's not just data protection.  We've had data protection laws for some time, and a Convention on data protection and cybersecurity that recently became operational ‑‑ which is, of course, binding.  But the Data Policy Framework is really an enabling framework that was passed by Member States in February.  It's a high‑level guidance document, so Member States can get onto the same page quickly.  It acknowledges the principle of progressive realization of a set of human‑rights‑preserving objectives.  It enables an environment that seeks not only to protect data but to ensure that data is available, as was mentioned earlier, for broader use ‑‑ not only public data, but also commitments from government to open data ‑‑ acknowledging that, in fact, with many, many people offline on the continent, a lot of the benefits of this are not actually available to many people. 

    So unlike other data governance frameworks, which assume an enabling legal framework, rule of law, human rights, and various other things, here these are progressively realized in terms of the objectives.  And the idea is that you would, in time, get the full governance environment you require ‑‑ one that would not only have individualised informed‑consent notions, which even people who are online can barely exercise, but would actually create the conditions through a series of foundational infrastructures.  Unlike other data governance frameworks, it sets out these foundational requirements that countries need in order to be digitally ready to realize the benefits of data value creation on the continent. 

    Those include your underlying broadband environment and data infrastructure, and of course digital identification, which has been the subject of so much discussion ‑‑ enabling and necessary, but how it is done matters.  We know from all over the world that it's important to ensure, firstly, access to those identity systems, while limiting surveillance and those kinds of harms. 

    So this enabling environment, this legislative environment, is absolutely essential, as we point out in the framework, for getting the levels of trust that are required for a successful data environment, data ecosystem. 

    So it's not only about security and data protection ‑‑ the traditional checkboxes after which you supposedly have trust.  No, in fact you don't.  Without high levels of legitimacy in the system you don't get that trust.  So there is quite a lot of emphasis on creating the enabling environment that would create that trusted environment ‑‑ allowing people not only to be protected from harms but also to have redress.  I think that's quite an interesting difference between this framework and others: the issues of enablement, and also the redress of the uneven distribution of opportunities that goes with this as well. 

    So it is very much in the context of the digital transformation strategy and Agenda 2063, but also the digital single market and of course the African Continental Free Trade Area ‑‑ which, unless countries get the underpinnings in place, they are not going to be beneficiaries of. 

    It's a voluminous document, and it covers a lot of the preparatory work that has to be done ‑‑ the preconditions ‑‑ and an extensive data governance section, which starts with data access and data governance and the opportunities there.  Thank you. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  Thanks for sharing ‑‑ similar issues across the continent, but different ways to address them.  I like the mention of the AU Data Policy Framework ‑‑ human‑rights preserving, ensuring the use of data ‑‑ and also the vision of one continent in a single market, and all the other things that come with the Continental Free Trade Area. 

    Now, I have a question, because you also mentioned the single market.  Is there something similar to the AU Data Policy Framework that exists for Europe, maybe? 

>> TOBIAS BACHERLE:  Well, we're kind of reshaping digital regulation right now, though not necessarily with one singular act.  We have the GDPR for data privacy regulation.  We now have the Data Governance Act and the Data Act, and we're currently working on the AI Act.  Actually, we're still missing the privacy regulation, and we are going to miss it for a while in the European Union. 

    With that new framework of digital regulation ‑‑ and I forgot the Digital Services Act and the Digital Markets Act ‑‑ this is basically the approach to reshape digital regulation within Europe by next year, if the AI Act works out to be negotiated within the next few months, which is the plan. 

>> LILY EDINAM BOTSYOE:  Right.  Thank you so much.  I'm now going to open the floor, and for our online participants there's also the chat.  For people in the room who want to speak, please approach the microphones and I'll give you the floor.  You can line up on the left and on the right with your questions or contributions.  We will then come back to the speakers for a one‑minute round‑up and a summary of what's been discussed.  Also, online folks, let me know if there are any questions from the online session.  The floor is yours. 

>> AUDIENCE:  Thank you very much for this extremely important topic.  Actually, there are always two sides of the coin.  We are talking about the flow of data being beneficial for several countries, like what we are doing in the AU Data Policy Framework.  But, again, we have a lot of countries that do not yet have a Data Act.  They don't have freedom of information within the country itself.  So if you're asking to have this between different countries, I guess we are talking about science fiction right now. 

    At the same time, if we're talking about the sandbox and needing to somehow control the data, the personal data of people ‑‑ we are in a universe that is mostly governed by multinationals and big corporations.  So I don't know if we can actually keep our data from being used as a commodity, which is happening right now, and nobody is able to control this. 

    I don't know if you in the EU are able to prevent these corporates from using the personal data of your citizens, selling it, or targeting those citizens with ads they don't want to see.  I think this is everywhere.  It's not just in our continent.  It's all over the place. 

    So it's completely complicated, and I don't know how we can get out of this ‑‑ how we can actually have the data be beneficial to all, for medical institutions and for research of course, but not be used against the people.  It's just a question, and I feel that we are in a big mess everywhere.  Thank you. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  Madam Alison, I'll come to you first, and then to you. 

>> ALISON GILWALD:  Thank you so much.  I think we are in a big mess, but I think for the first time there's some sense of pulling some strings together and getting some frameworks in place.  Before we saw this European Union push‑back against no regulation, or so‑called self‑regulation, I think people were saying there's nothing we can do, we've just got to live with it, it's part of a new reality.  I think there are things we can do. 

    Firstly, I think there's pressure being put by these bigger markets on these players, so they have to engage on certain terms.  I think privacy has long been used by them ‑‑ they've implemented strong privacy provisions because that's what their business model is built on.  During COVID, when there was actually a requirement to collect data, they resisted; data was withheld on the grounds of privacy, even though it could have been provided in an anonymized way.  I think those frameworks are emerging, mainly from the European Union's regulation of that market, so that other markets have been able to set the same kinds of conditions. 

    And although it's very difficult to regulate the over‑the‑top activity happening across the continent, we're able to regulate elements underlying it.  Issues around access to data are as important as ensuring the privacy issues.  I think there's work that has been done, but only a limited amount of success around individualised notions of privacy. 

    Actually, there are other collective interests ‑‑ you made the very important point of balancing the privacy question with access to information and freedom of information.  Many of our jurisdictions have taken the GDPR‑style data protection and cloned it, very often without the balance that's already there in the European Union.  In many of our countries ‑‑ in South Africa we do ‑‑ the data regulator also covers access to information.  I think this is a very important balance.  Because where we can do some regulation is around access to information: actually ensuring that individuals have sovereignty over their data, so they can get their data back, they can port their data to other competitors, that kind of thing.  And I think there's competition regulation and economic regulation underpinning that, so we can make those demands. 

    I want to say, lastly, that Africa really needs to piggyback on some of this regulation ‑‑ before the European Union did it, we thought it wasn't possible.  For example, the access to information that's now being required by the Digital Services Act: researchers can get access to big tech data.  But it currently only applies to European researchers ‑‑ Northern Hemisphere researchers, University of Michigan and others.  It's setting up inequalities, because African researchers do not have access to the data and are obliged to partner, and those kinds of things.  We've got to get it right on the continent too.  It's that harmonisation that gives us the scope and scale to be bigger players in that global market. 

>> LILY EDINAM BOTSYOE:  One more on the question and back to you. 

>> TOBIAS BACHERLE:  You asked whether we are able to enforce all we want to enforce in Europe.  I would say no, and we're also in a big mess, but we can to a certain degree.  I think it's important that as legislators we don't back down before big tech.  We don't have to.  Of course, we would like to have way more transparency in the algorithms, and way more transparency in the training data sets of AI models as well. 

    But this is a political discussion.  We now give this to researchers, and the big companies, or rather all the very large online platforms, need to do an assessment for themselves.  If discriminatory algorithms are found in that assessment, they need to take action.  And of course, in the end they are going to be fined if they don't comply. 

    So, yes, it's not as fast as we wish.  Yes, it's not as transparent as we wish.  And yes, we are not as good at enforcing as we wish to be.  But we are starting, and we're seeing a bit of change.  For example, Threads is not available in Europe, and no one really cares.  But with TikTok in the US, people would care, so they had to comply.  Of course, both sides were in a strong negotiating position: the US government because TikTok didn't want to lose that important market, and TikTok because it's a huge platform with Gen Z.  But in the end they had to comply, and they did what we're sometimes asking from American companies: they had to build a data centre and basically keep American users' data in that data centre. 

    So, yes, it is possible. 

>> LILY EDINAM BOTSYOE:  Thank you for that intervention. 

>> LAURA TRESCA:  I just want to comment that the Brazilian Internet Steering Committee looked at this issue and ran a public consultation on this topic of platform regulation.  I think the point was exactly that: to look at the power of these companies and how to balance it with fundamental rights.  And I believe that regulatory sandboxes can be used not just to enable new digital markets or digital businesses, or to reinforce that power, but also to put the human-centric approach at the centre of the story. 

>> LILY EDINAM BOTSYOE:  Thank you so much for your responses.  And now to questions from the floor. 

>> AUDIENCE:  Thank you.  For me, I wanted to zero in on what Tobias was emphasizing: that the policy decision-making process has got to be focused on the end user, whether that is a business or a human being. 

    But are we also going to talk about capacity building for the policymakers?  Because the assumption is that the people discussing these critical issues have the deep understanding needed to table these things and put the user at the centre of it.  Can we have that conversation also? 

    I would like to hear how the Europeans are looking at this issue of building capacity.  Because we can't jump to the conclusion of coming up with better, human-centric policies if we don't understand the emerging technologies and the impact they have on the human beings, or end users. 

>> LILY EDINAM BOTSYOE:  This was about building the capacity of the people holding these conversations and making these policies.  I'll take you and come back ‑‑

>> AUDIENCE:  Thank you very much.  My name is James.  I'm from Cameroon.  I have one question about data.  And ‑‑ I'm sorry.  Okay.  I'd like to know what mechanisms and standards can be put in place to ensure the responsible and transparent handling of personal data by organisations, particularly in the context of emerging technologies like IoT and AI.  Thank you very much. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  I'll take you. 

>> AUDIENCE:  My name is Sam George, a Member of Parliament from Ghana.  I've been trying to understand the entire conversation, because the session is supposed to be about what Parliamentarians can do for trustworthy online spaces.  We're hearing a lot about sandboxes and a lot about regulators, but not what we as Parliamentarians can actually do to create these safe spaces.  For me, that's what I'm looking out for.  That dovetails into what the gentleman before me said about capacity building.  You can talk about all the needful regulation and sandboxes, but if Parliamentarians don't understand the frameworks that need to be put in place, then we're beating around the bush.  You talk about the AUDPF, but it's one thing to have a Data Policy Framework and another thing to implement it.  We have lots of countries around the world that have policies on the shelves and in the books that haven't been implemented.  For me, that's what the conversation must be about.  How do we get Parliament to begin to be that voice and that bridge between Civil Society and regulation, using legislation?  How do we get to that point?  That's the takeaway I want to get from here. 

    On the question that was asked, she said it's a mess, but is it possible?  I think it's possible.  There's a conversation around Web 3.0: what can we do about that?  How do we use blockchain and AI to put the power of data in the hands of the generators of data, with data users owning their own data, and take the monetisation of data away from big platforms like Facebook, or Meta, and Alphabet?  How do we as Parliamentarians across the globe begin to push policymaking around Web 3.0?  And as we discuss this, can we ask ourselves how we create a level playing field, when the US government passed the Cloud Act, which allows it to access African data and Asian data in a way that we can't access American citizens' data?  These are the issues we would love to find solutions to.  Thank you. 

>> LILY EDINAM BOTSYOE:  Thank you.  That is, where do ‑‑

(Applause)

>> LILY EDINAM BOTSYOE:  What do legislators do?  Right.  Thank you.  So I'll take the last one.  There will be a response.  There's someone online I'll reach out to.  The floor is yours. 

>> AUDIENCE:  That was an excellent point you just made.  Building on that, if I may: where legislation has been enacted, I think it's important that legislators then monitor how it's applied and where it's abused.  For example, the GDPR is fantastic, yet most companies abuse "legitimate interest" to gather my private data anyway, and it takes quite a lot of effort to delve into the menus to opt out of things. 

    There needs to be more enforcement of acts once they are enacted. 

    Similarly, a lot of the tech companies focus on privacy, but they treat it as an absolute rather than a conditional right, using encryption and so on to provide privacy.  In reality, a lot of that encryption undermines security.  And one of the issues we have is that when internet standards are updated, they're deliberately updated to ignore the policy impacts of changes to those standards. 

    Part of the problem is that legislators are not in the room when those standards are discussed.  So virtually no governments are present at the IETF, the standards body that does most of these standards.  That's a real problem because it means there's no accountability to the standards bodies.  So sending people there is absolutely essential. 

    Then finally, very briefly, one thing about privacy.  We talked about some of the human impacts, and I appreciate there's a trade-off around privacy, but one of the big issues, which is difficult and is ignored mainly by the tech companies, is how to detect and remove child sexual abuse material.  I think that's an area where legislators need to act, and, dare I say, deal with the hugely aggressive lobbying in the tech sector, which invokes privacy while tending to ignore the enormous problems created. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  So we don't lose sight of the questions, we'll attempt a response now, and I'll come to you in a bit. 

    So, for our panel members: we have four questions right now.  You can pick the ones you want to respond to, and then I'll come back to the floor.  I'll start with you. 

>> ALISON GILWALD:  Thank you so much.  These points made around capacity building and implementation are absolutely critical and have been a very strong focus of the African Union Data Policy Framework.  When I spoke of the framework earlier, that was the framework's development, the first phase of this project.  Unlike many of our other policies, which are great on paper and, as mentioned, struggle with implementation, the African Union Data Policy Framework has an implementation phase, the second phase now that the framework has been passed.  The implementation phase is a kind of plan of action setting out the strategies that need to be followed.  It includes a capacity-building self-assessment tool and a monitoring and evaluation tool for States, and there is support through GIZ for countries wishing to implement the self-assessment tool.  It's very comprehensive: it's not just a data protection law, it looks at judicial reviews and aligning legislation and that sort of thing, right down to the establishment of data protection and other regulators as well. 

    So that implementation phase is under way, and training is going on at the moment through the regional economic communities, the network of data protection regulators, and the African Union's implementing agency, AUDA-NEPAD.  There's a strong effort towards implementation, though that doesn't diminish the enormity of the problem.  Part of it, as Sam is aware, is the Parliamentary training through the network he's very involved in.  The challenge, as we said, is that it's so uneven.  We've got countries with full data protection laws and data environments; they've got regulators already established, and, more importantly from the Parliamentary point of view you were speaking about, they have Parliamentary portfolio committees that have developed a level of expertise to cover this area.  This is so complex that no generalist Parliament could be on top of all of these issues, but you can build the specialized skills for standards, accountability, that kind of thing. 

    But I can't emphasize enough the critical role, in an ordinary democratic process, of Parliament in overseeing policy.  When we speak about policy and regulation: if the legislation is set up correctly, the regulators are autonomous bodies, and it is Parliament that has oversight of what they're doing and of the implementation challenges they need support with, in terms of resource allocation, budget, and those kinds of things.  So I think it's central to get that capacity into Parliament and its supporting institutions. 

>> LILY EDINAM BOTSYOE:  Right.  I'll go to you, Tobias, and then to you Laura. 

>> TOBIAS BACHERLE:  There were many, many good points on policymakers and capacity building. 

    I would just say it's something we're working on as well.  As soon as some issue arises, and this also goes for the CSAM topic, policymakers try to hit at it whether or not they understand it; in German we have a saying for that, which I won't translate.  So in Germany we had the debate when people were discussing in Telegram channels that the vaccine doesn't work, and conspiracy theories.  You have the Digital Services Act and a discussion at the policymaker level among those working on it: is Telegram an online platform, how can we regulate it, where does private communication end?  Yet people ask, can we just shut it down?  And we say no.  It doesn't go away if you shut it down; it just changes platform.  And first of all, you can't that easily shut down a messenger. 

    I think that's very important.  The only bright light I see is that with generative AI, policymakers get frightened.  When they get frightened they often come to dumb conclusions, but they also get interested.  I think that's a huge opportunity.  For example, we know OpenAI used data it was not allowed to use.  We know OpenAI employed click workers under horrible conditions to create ChatGPT.  Policymakers can't lose sight of that, I believe. 

    One thing about the United States.  As you might know, as Europeans we're in something of a struggle with them, because we believe the Cloud Act is problematic and does not comply with the GDPR.  So we have that tension.  Now we have the Data Privacy Framework, an agreement, though I don't believe we'll have it for long.  The general idea is that we have a regulation: either you comply with it, or you can't store personal data of European citizens. 

    Of course, this is a very drastic step.  But I think to have that drastic step in the room and be able to negotiate what's allowed and what's not allowed is a very, very important step. 

    Last but not least, sorry, I believe private conversations need to stay private.  That might be a very German view, because within the last hundred years, not even that, we had two horrible dictatorships in Germany. 

    And I know that sometimes, in matters of security, it's easy to ask for mass surveillance.  But first of all, go and ask your law enforcement whether they're actually able to do their work on the data they already have.  This also goes for CSAM, because in Germany they're not able to: they don't have the capacity to go after every single file they get sent. 

    The second thing is, always think of the misuse that can happen, whether intentional or through technical error.  I think in the matter of CSAM this is important.  There are victims who rely on private communication with their attorneys, with journalists, with people they trust.  It's incredibly important that they can rely on their private communication about those very, very private issues staying private.  It might even endanger them if it does not. 

    So my take on the CSAM debate is: let's keep private communication private, because that is also for the good of victims and other marginalised groups, and let's do everything we can against it by other means, such as chain analysis, blockchain analysis, to try to figure out who paid whom.  And on the question of prevention: so many people in schools are not willing to talk about these issues, or have never heard about them, and the hotline we have in Germany is not even available around the clock.  That's the case in most European countries and most states around the world.  As long as we haven't done all that, let's not try to break into secure communication. 
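The "figure out who paid whom" idea mentioned here, chain analysis, treats a public ledger as a directed graph of payments and traces flows through it.  A toy sketch follows; the ledger entries and wallet labels are invented for illustration, and real chain analysis works on pseudonymous on-chain addresses at vastly larger scale:

```python
from collections import deque

# Toy ledger of payments (payer, payee, amount); wallet labels are invented.
LEDGER = [
    ("wallet-A", "wallet-B", 0.5),
    ("wallet-B", "wallet-C", 0.4),
    ("wallet-C", "wallet-D", 0.35),
    ("wallet-X", "wallet-C", 1.0),
]

def downstream_recipients(start):
    """Breadth-first walk of the payment graph to list every wallet that
    directly or indirectly received funds originating from `start`."""
    graph = {}
    for payer, payee, _amount in LEDGER:
        graph.setdefault(payer, []).append(payee)
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(downstream_recipients("wallet-A"))  # ['wallet-B', 'wallet-C', 'wallet-D']
```

The point of the sketch is that this kind of investigation needs only the public ledger, not access to anyone's private messages.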

>> LILY EDINAM BOTSYOE:  Thank you.  Over to you, madam Laura and back to the floor. 

>> LAURA TRESCA:  Thank you so much for the points and comments.  What can Parliamentarians do?  My two key messages were, one, test the legislation beforehand.  Don't do it in your head, or in the traditional way you have always developed legislation. 

    And the other key message was to adopt a multistakeholder approach, because it's important to put the human at the centre of the legislation. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  I'll take questions from the floor.  I'll also ask about the issue of privacy, but looking at abuse: a question came in about how you strike a good balance between privacy and what we can do about abuse.  I'll take the one on the floor and the one online, and then the question about privacy and abuse. 

>> AUDIENCE:  Thank you very much.  I'm a member of the Albanian Parliament.  We have heard very good remarks here.  When we think about the stakeholders, we should consider different dimensions.  We know that users should have the right to decide whether or not their data becomes part of data gathering.  And I agree with our colleague Tobias that people's private conversations should be protected; that's the right idea, though we know the reality.  As Members of Parliament working with government, we should keep in mind that all of these multinational companies, Alphabet, or Meta, Twitter, or recently (?), are gathering people's data, and they are selling it and manipulating some of this data.  What should governments do about that, to defend the data of netizens in their conversations?  I think it is one of our duties in Parliament to defend the people of our country so they have their own privacy.  It's about the governments and also about the multinational companies: a government can get this data and use it to sway political decision-making, as we have seen in the last United States election.  I think that is very important, and one of the duties we can discuss in IPU sessions.  Thank you very much. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  I'll take the second one.  Yeah.  So the second question and third and the last will be the one online.  So over to you. 

>> AUDIENCE:  Thank you, and good morning.  I'm Jane from South Africa, a Member of Parliament on the ICT committee.  I think one must acknowledge that you've elaborated on the responsibility of us as Parliamentarians in terms of legislation.  That's where we need to start, and we must ensure we take everybody along.  People must understand the process with respect to ICT and Internet Governance. 

    I have a question with regard to what you said, Tobias, on issues of principles.  I think it's about time we understood that we need a uniform approach to Internet Governance, because it's one of the key services that cuts across everything we do socially and economically.  There's a need to make that process uniform so that for all of us it doesn't become apples and bananas: we have one approach to Internet Governance, this is the standard, and this is how we need to go, so we don't compromise any part of the process, be it for the user or for the organisations. 

    And I want to check with Laura, in terms of consultation: how do you make sure that you bring in the marginalised, or those in far-off places, so that they make input into your drafting of the bills?  Because at times when we speak of ICT, we seem to be creating more gaps in terms of inequalities and imbalances.  I want to know how you make sure that you bring everybody along. 

    Number two, do you have any data on the different demographics that you consulted during the process of your (?) today?  If so, could you share it with us? 

    I think, Tobias, what you said is important.  We cannot separate privacy from security; they are two hands, left and right.  There's a need for us as countries to speak with one voice, one message, about why it is important that everybody understands that he or she has a right to privacy, so that they're able to share, knowing there would be accountability for misuse and other things.  Thank you. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  Last one before online folks and back to our panel members. 

>> AUDIENCE:  Hi.  Good morning, everyone.  I'm a Parliamentarian from Nepal.  I have three questions.  One, what are some ways we can collectively work to regulate large platforms?  The EU has a lot of negotiating power with the various platforms on content moderation; Nepal doesn't.  Fake news is becoming one of the key ailments in society right now, and I'm thinking of ways we can tackle misinformation, disinformation, and outright fake news spread to incite communities against each other.  Is there something in your policies, thinking collectively as global citizens, about the demands or requirements we put on the large platforms? 

    Second, Parliamentarians need access to the latest developments.  What are some of the resource pools available for us to learn about what's happening?  People talk about generative AI; I just learned about the Cloud Act.  Is there a Secretariat, a place I can go, or can send my people, to learn about what's happening, so that I can proactively work on drafting laws? 

    Look, we are at a stage where we are still drafting a cybersecurity bill in Nepal; we are very far behind.  Finally, what are some of the best practices for getting input from people?  You need experts who understand the technology, and there's the question of tolerance for news, and of balancing freedom of speech with security issues. 

    So how have you gone about creating this?  Do you have working teams for different streams?  What are some of the best processes as a country goes about drafting?  What are the basic frameworks, and how do we contextualise them?  Thank you very much. 

>> LILY EDINAM BOTSYOE:  We have a group of questions, some relating to each other.  There's a concern about not leaving people behind and how we do this collectively as Parliamentarians across the world.  We've heard the issues of capacity building for Parliamentarians and the issue of legislation: how do legislators make sure there's oversight of organisations and the work they do?  And there was the recommendation from our MP about how we can show up as legislators, for every MP in here, how to show up for the people who are end users.  All of these are related, so I'll allow the panel members to speak to them, and we'll round up with the online participant, who has been so patient waiting for us. 

    Madam Alison, you wanted the floor.  Actually, I'll start with you, Laura, then you, Tobias, and round off with you. 

>> LAURA TRESCA:  I wish marginalised people could take part in the discussion.  I can see several measures to facilitate this participation, such as translating technical language into common language, and training.  But the fact is that marginalised people are worried about their basic needs.  And although we have a multistakeholder approach in Brazil, and civil society organisations take part in the multistakeholder model of the Brazilian Internet Steering Committee, the organisations that have a seat on the Committee are specialized in digital rights. 

    So they are, I myself am, a representative of a Civil Society organisation.  We have that background and a sensitivity to social justice, but we are specialized people.  I think this is something we need to work out: how to bring more marginalised people into the discussions.  We haven't found the way yet. 

    So when I mentioned the participatory process, it was a niche of organisations and people that took part in that process.  But it was very distributed; you asked about the demographics, and it was spread around the country. 

    That is the profile of participation we have.  To respond to the second comment, about how we can have technical support for Parliamentarians: what we have done in Brazil is sit together.  This is very important, because sometimes we have an idea and the technical people say, this is not technically possible, or it's technically possible but the cost of the solution makes it impossible. 

    And about how to regulate the large platforms, I think the world is still seeking an answer.  At the beginning of this year, a draft of some guidelines on how to address disinformation was put forward.  Maybe that's a tentative way, but we haven't found the path yet. 

>> TOBIAS BACHERLE:  Well, on what we can do as global citizens against the rising problematic use of data by big tech, but also against misinformation: first of all, I would start with microtargeting.  That's their business model, and restricting it is the most efficient lever.  How far to restrict it is up for discussion.  What we did in the European Union is restrict the collection of data from minors; it's basically not allowed anymore, so a platform that knows a user is a minor needs to stop.  The other area is political ads. 

    "Political ads" is not the most precise definition, but it means that within an election campaign you're not allowed to run microtargeted ads anymore.  The other part is, of course, transparency, and we've already moved there a bit by now.  On Facebook, if someone is running a political ad, you can click through and see all the other ads that person is running as well, which is very important.  If I tell one part of the population I'm advocating for topic A, and another part that I'm advocating for topic B, when in fact I'm advocating for topic C, that is false information.  That brings me to one other point; I think we're going to have another session on misinformation and disinformation as well. 

    But very quickly, my two cents on what we can do together: judge the distribution, not the content.  Because the distribution is very often manipulated, and the platforms know it.  Of course, I'm now repeating what Frances Haugen said about Facebook: they know which accounts are fake, but they can't admit it, because they would lose a certain share of their accounts and of their net worth.  The manipulation of distribution is very, very harmful. 

    The second part: context matters.  Most misinformation is just spread by accident; some is created with bad intention, and a lot is simply passed along.  If you label it with context, for example when a very small snippet has been taken out of context and the context is restored underneath it, shown there just as a link, that's an important tool to give people an entry into the discussion, to tell them, well, there's something you should know about this as well. 

    I think that's somewhere regulation can step in.  Sorry about that.  On providing context, I think we should rather lean towards platform councils.  Traditional media has basically been doing this very efficiently and very well for ages.  Give the users back the power to add certain disclaimers that there's missing information in a post. 

>> LILY EDINAM BOTSYOE:  Thank you. 

>> ALISON GILWALD:  Thank you so much.  Just in response to a number of these questions, I wanted to say that, especially in Africa, we've still got a lot of challenges at the national level.  There are a lot of issues we still need to sort out there.  Effectively it is an elite conversation when it happens in Parliament, because most countries have less than half their people online.  We have people who are invisible in these algorithms and various systems.  So there's a lot of work that has to be done at the national level. 

    On increasing governance, or global governance: I think the importance of harmonization among African Union Member States is that it would allow us to speak together and represent African interests better in these international fora.  There are many things going on, the UNESCO guidelines on platform governance, various countries' online safety acts, that influence global governance. 

    So I think we have to be far more engaged to represent our interests and be able to align with the broad principles.  Importantly, in regard to the member from Nepal's point, we should also take those international examples of practice and look at them in our own context.  That's where we don't have enough sharing between contexts that are more similar; it's interesting, for example, to hear about the Brazilian case. 

    Much of this very complex legislation rests on assumptions about democracy and about the institutional endowments of the countries concerned.  We take the GDPR, et cetera, and we are unable to implement it successfully in our countries.  The African Union framework tries, despite the very different political economies and institutions involved, to get to a common position. 

    Practically, what that means for Parliament, and what I was going to raise, is that we've got a lot of legacy issues.  For example, we're dealing with very complex, multisectoral issues that require transversal policy, yet in telecommunications committees and ICT committees we're trying to deal with issues of AI or transparency.  I think one of the things Parliaments need to do is look at those Parliamentary committees; some countries have multisectoral committees, but there are many that don't.  You need economics and security in there; you need science and technology in there. 

    Very often at the moment we're getting a siloing of a lot of this policy, so there's actually (?) it.  A lot of AI work is happening within science and technology, without speaking to the basic conditions in the communications sector that I spoke about.  So I think we need internal reform within Parliamentary processes.  There is also, for many of our countries, a lack of participation in the policy process: a lot of that skill, a lot of that specialization, is sitting in the private sector and Civil Society.  Misinformation has been taken up in the South African Parliament, and I think that's welcome.  We need far more participation in the system and harmonization of our national systems so we can be more effective in global governance. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  There's an audience online.  Celine, do we have the question ready to be read before we go to the floor? 

>> LILY EDINAM BOTSYOE:  Then I'll come back to ask: where do Parliamentarians get resources to stay updated on this?  Is there any place we can find information to keep updated on some of these issues we're talking about?  I'll bring that back to the panel.  While the online question gets ready, we'll take the one on the floor now.  You can have the floor. 

>> AUDIENCE:  Thank you.  My name is (?) and I am an MP from Iran's Parliament.  I have a challenge regarding the subject of this session.  One of the requirements for every democratic system is for people to have the ability to shape their own lives and spaces.  For this requirement to be fulfilled, the institutions and representatives that people vote for must take precedence over corporations that are neither made by nor greatly influenced by people's votes.  We know full well this is not the case.  In our world today, multibillion-dollar platforms controlled by a handful of corporations have an extremely free hand in controlling narratives and changing cultures and values, with no regard for the wishes of free, independent nations. 

    Of course, a handful of world powers control these platforms and force them to abide by their laws, but few governments are granted the right and opportunity to question these corporations the way American representatives, for instance, interrogated the TikTok CEO, or the way European regulators fine these corporations hundreds of millions of dollars.  So the question that remains is: how can we, representatives voted for by our nations, do our job to keep virtual spaces safe and protect the public's information, when the major global platforms can only be held accountable by foreign entities with no care or concern for our people's interests? 

    Yeah, I think we should answer this question first and then go into more detail later. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  I'll come to you quickly.  You can come. 

>> AUDIENCE:  (Microphone muted) because in our country we have over 75% of the total population as internet users, and we come second in terms of users per person in (Audio breaking up)

    (Audio breaking up) digitalisation of socioeconomic development.  But we also see it has a dark side.  On the brighter side, we have the digital government development plan, which is to (?) the government's platforms, and we are also working on the digitalisation of governance and guidelines, and on the emergency decree on cryptocurrency to ensure the supervision of digital asset activity.  As I said, every coin has two sides: digitalisation has also brought people face to face with threats and made them victims of cybercrime. 

    Over the past year we have had over 300,000 cases of deception, in which people lost over US $1.25 billion.  We had to quickly issue an act on cybercrime.  Beyond what each country does, I think international co‑operation (Audio breaking up)

    And cybercrime has no borders.  Last but not least, we also have the Personal Data Protection Act, issued by the Parliament and the government last year.  The PDPA is considered comparable to the level of the European GDPR.  When it is implemented, the PDPA will change the landscape of personal data in Thailand.  Therefore, I think in this Parliamentary Track, our (?) digitalisation must come with some safeguards for the dark side of the coin.  Thank you, speakers, moderators and audience.

>> LILY EDINAM BOTSYOE:  Thank you for the intervention from Thailand.  Can you ask your question in one minute?  We'll get responses from the panel members. 

>> AUDIENCE:  I'm from Liberia.  My question goes to Madam Alison, about the issue of this policy framework.  One of the issues we have in South Africa is the promotion of these policies.  What are some of the pragmatic, realistic timelines you have put in place to ensure that, for example, the framework takes effect?  If you look at some of the policies from Africa, at the local level in the countries there is no access.  We see it (Audio breaking up) in Africa.  My question is: it will help align policies in countries like mine, Liberia.  How much effort is being made to push it at a level that ensures the content is as clear as it needs to be for localized policy making? 

    The next one is: we want our Parliamentarians (Audio breaking up) capacity building, but we want capacity building at a level where, when there are data breaches and data privacy issues, we can bring the big tech companies to our countries, the way TikTok was brought before the US Congress.  That's the level of capacity we want.  We in Africa also use these platforms; TikTok, Facebook and others are very popular.  This is what I'm trying to get answers to. 

>> LILY EDINAM BOTSYOE:  Thank you.  So we're going to take the final words.  And it's going to be ‑‑ so sorry, I see one more member here.  You can talk further with our panel members after the session. 

    Let everybody say their closing remarks and we'll round out the session.  Thank you. 

>> ALISON GILWALD:  Thank you so much for that question.  So you're absolutely right that it's very difficult to implement these wide‑ranging policies in a very short period of time. 

    I think the reason the Member States were able to act quickly on this is that, unlike the Convention, it does not have to be ratified in every country.  The Convention is a very high‑level document; the implementation framework is a practical document.  As I said, it has a number of tools with it. 

    So the speed with which the domestication of the policy can happen will determine the free trade area and all the kinds of benefits we see from harmonization.  I'm not trying to undermine it; it's an enormous task. 

    I think timelines (Audio breaking up) of the realization of principles and activities.  I suppose there are resources to support implementation, capacity building, those kinds of things.  It's dependent on the Member States, and perhaps that's where Parliaments could play a role.  It depends on the Member States and the speed at which they want to access those resources so they can get (?) et cetera. 

    (Audio breaking up) framework, which was about six months after it was passed.  Very few Parliamentarians had been informed by governments; even within the governing parties that had signed off on this legislation, few were even aware this framework was in place.  Improving those communication lines, so that Parliamentarians are more engaged at the African Union level (Audio breaking up) and enabled to get the resources needed to implement, is going to determine the speed at which we get domestication and harmonization. 

>> LILY EDINAM BOTSYOE:  Thank you. 

>> TOBIAS BACHERLE:  I think, because you left us the question last time of where we can learn from each other: I think it's places like this, and we need to strengthen that.  We need to strengthen the Parliamentary Track and keep it running at the IGF and everywhere we have these discussions. 

    When it comes to capacity building of Parliamentarians: I don't know if you followed the TikTok hearing in the US.  I had a good laugh watching it, especially when they started to ask about the Wi‑Fi.  I could not follow it, but it was hilarious.  People watching it (Audio breaking up) ever.  It streamed on TikTok as well.  And I think this is very important to understand.  If we face issues that affect so many people, we need to be well prepared.  Because if it gets embarrassing, people are watching, and consistently watching.  They need to believe that we can tackle those issues.  If they see us asking about Wi‑Fi when we're talking about TikTok, they say, okay, we need to handle it ourselves. 

    But then we come back to the beginning; that's so important.  Because we need the trust of the people, of the users, to share data.  Data is knowledge, and we want to have that knowledge and thrive on it.  Therefore we need trust.  We need data sovereignty, transparency and control over data, and there are little steps we can take: tackling big tech, sharing our data estate, having a data cockpit for our personal data, taking control on any occasion we can.  I think that's the line we need to follow. 

>> LAURA TRESCA:  Regarding big tech ‑‑ I think the large platforms created a problem for us: when they were selling cars or bicycles and then they started to sell (?) (Audio breaking up) self‑evaluation is definitely not working. 

    But we haven't reached agreement on how to regulate, what to regulate, or what the measure should be.  I think the debate is (Audio breaking up) on this topic.  So we have a long path ahead of us. 

    Regarding the question about how Parliamentarians can build expertise: we can establish expert groups, hold public hearings, conduct surveys, that kind of thing.  I think it has worked well in Brazil. 

>> LILY EDINAM BOTSYOE:  Thank you so much.  So we had some questions that we couldn't get to, but our Parliamentarians and speakers will be here after the session to engage further on the topic.  We tried to understand the role Parliamentarians can play in supporting a trustworthy online space, (Audio breaking up) capacity building to be able to create policy and regulation that protects users, and, first, the right to information to be able to protect users, who are also asking how we can help.  It's been a great session.  I hope you learned some best practices from Germany, Brazil, the African continent and also Thailand.  We all have things to learn from each other.  Thank you for being a part of this; thank you to our panel members for the talk, and to you for listening.  We appreciate your time.  Do have a good rest of the week and enjoy the IGF.  Thank you. 

(Applause)