IGF 2019 – Day 4 – Estrel Saal C – OF #45 Information Sharing 2.0: privacy and cybersecurity

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> ISABEL SKIERKA: Hello, everyone. Welcome to our open forum on Information Sharing 2.0. We're very excited to discuss this topic, albeit on the last day of the IGF, and I hope that we will have what an open forum is supposed to be, which is an open discussion amongst all of us.

Let me first introduce the session very briefly. Our panelists will then have a brief discussion amongst themselves, and you're very much invited to join the discussion.

My name is Isabel Skierka, and I'm the moderator here. I work at the European School of Management and Technology and have a policy background. So I will be moderating.

Now I will introduce this open forum.

So our session today will discuss information sharing 2.0, with a focus on cybersecurity information sharing and privacy and how the two might conflict or can be reconciled.

As many of you in the audience probably know, sharing actionable information about vulnerabilities, malware indicators, mitigation measures, and other information really strongly promotes cybersecurity.
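
To make "actionable information" concrete, here is a minimal sketch in Python of the kind of machine-readable indicator record such sharing involves. The field names are illustrative assumptions, loosely inspired by STIX-style indicators rather than any particular standard's schema:

```python
# A minimal sketch of a shareable indicator-of-compromise record.
# Field names are illustrative assumptions, not a real standard's schema.
import json
from datetime import datetime, timezone

def make_indicator(ioc_type: str, value: str, description: str) -> dict:
    """Build one shareable indicator-of-compromise record."""
    return {
        "type": "indicator",
        "ioc_type": ioc_type,        # e.g. "ipv4", "domain", "sha256"
        "value": value,              # the observable itself
        "description": description, # why a recipient should care
        "created": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    feed = [
        make_indicator("domain", "malware-c2.example",
                       "Botnet command-and-control seen in campaign X"),
        make_indicator("sha256", "e3b0c44298fc1c14...",
                       "Dropper observed on 2019-11-25"),
    ]
    print(json.dumps(feed, indent=2))
```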

As policy and law have evolved, a lot of questions have been raised about the privacy implications of information sharing, and practitioners especially will need to address them. Legal frameworks such as the EU General Data Protection Regulation or the U.S. Cybersecurity Information Sharing Act of 2015 have attempted to tackle these conflicts, yet uncertainties remain.

The goal of this open forum is to discuss ideas and best practices about information sharing and how to reconcile cybersecurity and privacy in this space.

So the experience shared here should ideally help and inform global policy makers as well as the global information sharing, incident response, and privacy communities, so that they can take away some of these best practices and implement them in their own environments.

So the questions we would like to discuss pertain to the legal factors and considerations behind these legislative texts, to how these provisions have been understood and implemented in different sectors, public and private, in different countries, and to what this means for global interoperability in this area.

We have an excellent panel to discuss these issues today. We have two representatives from the Israel National Cyber Directorate here.

We have Amit Ashkenazi, who is in the middle here. He's the legal advisor of the Israel National Cyber Directorate in the prime minister's office. His tasks encompass developing the directorate's domestic and international legal policy. He has been active in the field for a long time, covering not only security but also data protection, copyright law, and government issues. Before that, he held roles at the Ministry of Justice, amongst others.

We are also joined by (?), who provides legal advice pertaining to international law and cyber law and is responsible for the directorate's international agreements. So we have both the domestic and the international perspective on the panel. Prior to that, she worked at the Ministry of Foreign Affairs.

Here to my right, we welcome Andrew Cormack. Andrew has been around in the community for a long time already. He joined the UK's national research network 20 years ago to run the computer emergency response team there. For the past 15 years, he has looked at regulatory and security issues related to data and education. Andrew has a very interdisciplinary outlook, knowing a lot about the technical side of incident response, the practical sides, and the legal issues he himself often has to solve.

We'll start with Andrew, who will now give us a brief overview of the challenges he has faced in his experience.

>> ANDREW CORMACK: Thanks, Isabel. Let me learn to use the technology. I started by thinking about what would happen if only the bad guys shared, if the law really didn't allow incident response teams, vendors, and researchers to share data, which seems to be the perception sometimes. We would end up with fewer anti‑virus solutions. We would end up with no coordinated disclosure of vulnerabilities, because all of those involve sharing. So your personal devices, whether that's your laptop or phone or baby camera or security camera, are all going to be vulnerable. Anything on the Internet gets compromised; the typical time is about an hour or two for a vulnerable device.

Because we said the good guys can't share, not only can't they prevent things from happening, they can't tell you when you have actually been successfully attacked either. So the state where all of those devices are listening to you, perhaps becoming you, is permanent. Nobody can warn you. You can't detect from inside the laptop that the laptop has been compromised.

It struck me that a city like Berlin is probably the most appropriate place to imagine what that state might be like. I was in the GDR museum, which is absolutely fascinating. The idea of having most of the laptops in the country compromised would have blown their minds. That's certainly not a state that leads to privacy. What we need is a law that is part of privacy law. I think Isabel said there's a conflict ‑‑ well, hinted there's a conflict. I think the devices are the key to online privacy.

I want a strong privacy law, not a security law or a network law or a national security law. So there's none of this suggestion of, oh, we have to balance this law against that one. It needs to be clear, so that CSIRTs can get on with protecting the Internet and not worry whether they need the attacker's permission to share data. I've heard serious suggestions that we can't share data about being under attack without the attacker's consent. No. You're not going to get it. So that has to be wrong.

But also limited, so we weigh what we share against the threat to users. We shouldn't be blasting information everywhere. We should be sharing what is most necessary and beneficial.

It needs to be broad in scope. In Europe, we're running into a risk with the new ePrivacy regulation, because that will cover only network operators and give them different privacy rules from websites, or appears to, and that creates barriers to sharing. If I want to share data with you, and you can do stuff with that data that I can't, then I'm going to be quite nervous about trying to do that.

But, in fact, looking at Recital 49 of the GDPR, which we could have up on the screen behind us as the theme for this session, I think it's pretty close to that requirement spec.

I will leave what we can do better for later. There are a few things; it's not perfect, but it's pretty good. Ultimately, I think what we have to remember is that security people protecting the network and the computers, and privacy people, are very much on the same side, because both of us need to keep information secure from the bad guys. Otherwise, our main desire isn't met. What greater privacy breach can there be than having somebody else in control of your laptop or your smart home? I hope we don't get into a position where the law forces us into that.

>> ISABEL SKIERKA: Amit, do you want to weigh in a bit more on the legal context as well?

>> AMIT ASHKENAZI: Thank you. Indeed. The way we approach this, in the evolving discussion between technologists and lawyers, is as a need to balance, and a need for a framework that gives certainty and allows cyber defenders to do their job. This is the constant dialogue we're having. I would like to share a bit of the mechanics of the legal analysis when we approach these issues. I should state that in Israel privacy is a constitutional right, and we also have a data protection law; we have even been found adequate by the EU. For the sake of clarity, I will use that as my reference point and, if you like, double-click on this message of how we really help the network defenders with the law and balance privacy against other issues.

First of all, we need to take into account that privacy law, and the GDPR itself, talks about, on the one hand, protecting individuals, but on the other hand enabling the free movement of data. So we're constantly balancing. This has been a core concept in data protection law since the 1980s: the privacy principles enable data flows while protecting against harm to privacy.

Usually it is useful to look at Recital 49-style rules, and I would point to Article 6 of the GDPR, related to the general interest. Without going into too much legalese, we believe cybersecurity is very, very important: it enables protecting privacy, which is a core concept within the GDPR, and it also enables protecting other important societal functions and interests on computer networks.

And then we need to go to the second test, which is where the dialogue with the technologists is most fascinating ‑‑ I'm using a diplomatic word here. The question of necessity. The question we ask ourselves as lawyers giving legal advice to technologists is: this is the technical measure that will support the mission, but is it completely necessary? For instance, do we need to retain all logs of a specific device in order to understand what is going on and to be able, if we have a breach, God forbid, to replay the security cameras later?
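
As an illustration of that necessity test, here is a minimal sketch, assuming a hypothetical log pipeline, of retaining only what incident reconstruction needs while pseudonymizing direct identifiers. The field names and the keyed-hash approach are illustrative assumptions, not a prescribed method:

```python
# A minimal sketch of necessity-driven log retention: keep what
# defenders need to "replay the cameras" after a breach, but
# pseudonymize direct identifiers first. Illustrative only.
import hmac
import hashlib

SITE_KEY = b"rotate-me-regularly"  # secret pepper, rotated on a schedule

def pseudonymize_ip(ip: str) -> str:
    """Replace an IP with a keyed hash: still linkable across log lines
    for incident analysis, but not trivially reversible."""
    return hmac.new(SITE_KEY, ip.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(event: dict) -> dict:
    """Retain only the fields needed for later incident reconstruction."""
    return {
        "ts": event["ts"],
        "src": pseudonymize_ip(event["src_ip"]),
        "dst_port": event["dst_port"],
        "action": event["action"],
        # payload and full URL deliberately dropped: not necessary here
    }

print(minimize({"ts": "2019-11-28T10:00:00Z", "src_ip": "192.0.2.7",
                "dst_port": 443, "action": "blocked"}))
```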

The lawyers need to do the balancing act, so we need to look at the context of what we're doing, ask what the risks to privacy of the intended technical operation are, and decide whether it is, if you like, net positive.

In this area, there is another interesting interplay between cybersecurity and privacy, because one of the factors we take in is the mitigation strategies: how we can reduce the risk to privacy that we have identified. In this sense, we take cybersecurity thinking and turn it on itself, using cybersecurity measures to audit ourselves, to make sure people are only accessing what they need, and using automation and other measures to protect privacy.

The final thing I've seen in this discussion is the need for a better understanding of the different elements of this analysis, so that we can have a more informed discussion that goes from the high-level principles, which we all agree upon, to pragmatic solutions. This ultimately supports the defenders and reduces the risks to them and their activity. It allows their stakeholders ‑‑ managers, counsels, and others ‑‑ to be more supportive of the mission.

>> ISABEL SKIERKA: Okay. Thank you, Amit. We'll continue right away with the international perspective and how that can inform the debate internationally.

>> Thank you, Isabel. Hello, everybody. I would like to share with you a pragmatic point of view based on my discussions with legal advisors to cybersecurity authorities from across the world as we try to come up with a framework for cross-border cooperation. When we do that, the question we're faced with is what cross-border cooperation might look like. Very soon the discussion turns to the question of how to create a common basic ground for the sharing of information. Just as Andrew said, when you understand that you have common ground and trust on the other side, sharing information is much more convenient and comfortable.

So we're basically facing similar dilemmas, but we're subject to different frameworks and different domestic laws. As you're probably well aware, legal and regulatory mechanisms governing the transfer of information are scarce when it comes to cybersecurity. It's a different world for us as legal advisors in this area; we often see ourselves more as the backseat lawyers. We know we need to lower the walls and build bridges between the tech community and the government by creating a common language that will enable this transfer of information.

I think that Recital 49 is a good start for developing domestic rules that would be interpreted and implemented in similar ways amongst states, and it may create the trust and confidence needed for us to enable this sharing of information. And this, of course, promotes cybersecurity and serves privacy.

Thank you.

>> ISABEL SKIERKA: Okay. Thank you so much.

I would like to ask one of you, just before we start the open discussion: it all still sounded quite easy from what you described. Everything is clear; security and privacy are obviously related, information sharing works well, and we'll respect privacy, right? And we have Recital 49 of the GDPR, which basically says: of course you should respect data protection rules, but because the sharing of threat information is so important, there are exemptions in this field. Maybe you can read it out again.

What are the actual conflicts, and what should we do here? I think this is still important in the discussion. Thank you.

>> ANDREW CORMACK: I think one of the challenges, particularly when working with government teams, is that they often have a different legal basis. So there can be a barrier there, either legal or in perception, because quite a lot of national CSIRTs are part of a national security agency, and that's a slightly worrying look to those of us outside. In that position, you have to be really strong in presenting: we are a CSIRT; we're not part of the security agency. I think there's huge misunderstanding of and ignorance about Recital 49. I don't think people know it's there.

>> ISABEL SKIERKA: Can you summarize it?

>> ANDREW CORMACK: Sure. The GDPR ‑‑ the important bit ‑‑ Recital 49 says, basically, that processing of personal data that is necessary ‑‑ this is horrible; I'm going to open it up to check that I've got it right ‑‑ processing personal data that's necessary in order to protect the security of networks and data may be a legitimate interest of a wide range of entities.

And the interesting words there are "legitimate interest," which takes you to Article 6. It's Article 6(1)(f), and it's the most important. Under Article 6(1)(f), you have to show that the information sharing you're doing is the least intrusive way there is.

Even if the sharing is necessary, if the risk it creates to individuals' fundamental rights ‑‑ and that's not just privacy; any rights, freedom of speech ‑‑ if my sharing of data creates a perception that I am Big Brother, so people change how they use my networks, I have to say: no, the rights of the individuals override the sharing that I want to do. I did, about 10 years ago, write a toolkit about information sharing, asking what data we want to share: if it's an IP address, that's probably less sensitive than a real name or an email address, because an IP address is less reusable.

Am I sharing it directly with the person who could benefit from it, or am I sharing it more widely? The risk is higher if I'm sharing it widely. Are there shared rules or a self-imposed agreement on how we use data ‑‑ only use data for the purpose of protecting networks, and not offensively? Sometimes, working with government, it's a bit tricky if I share a vulnerability with you. Several governments are being really good about this and publishing the terms on which they will either use knowledge of a vulnerability to fix it or keep it in reserve in case they want to hack another nation's systems, and there's a lot more openness about that process, which I think is really good. But working with government CSIRTs, that's a risk factor.
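
Andrew's weighing of data sensitivity against breadth of sharing can be caricatured in a few lines of code. The scores and threshold below are invented for illustration; a real assessment would be the documented, convincing story he describes:

```python
# A minimal sketch of the toolkit logic described above: weigh what
# the data is (an IP is less reusable than a name or email) against
# how widely it will be shared. Scores and threshold are invented.
SENSITIVITY = {"ip": 1, "email": 3, "real_name": 4}
AUDIENCE_RISK = {"affected_party": 1, "trusted_csirt_group": 2, "public": 4}
THRESHOLD = 6  # above this, sharing needs extra justification or redaction

def sharing_risk(data_type: str, audience: str) -> int:
    return SENSITIVITY[data_type] * AUDIENCE_RISK[audience]

def may_share(data_type: str, audience: str) -> bool:
    """Crude proportionality check before releasing one data item."""
    return sharing_risk(data_type, audience) <= THRESHOLD

assert may_share("ip", "trusted_csirt_group")  # risk 2: fine
assert not may_share("real_name", "public")    # risk 16: redact or withhold
```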

The other tool we used ‑‑ I mentioned turning security tools on themselves; we've actually turned the full legal toolkit on ourselves. We've done a data protection impact assessment for our operations center, for a network that has 18 million people on it: all the universities and colleges and most schools in the UK. So we can see an awful lot of data if we need to. We've done a DPIA, and it's public; it's been shared with information commissioners and the European Commission, and you are all very welcome to read it. We've tweaked things as we're learning, but it was a good experience.

>> ISABEL SKIERKA: We may come back to that.

Would you like to say something as well?

>> AMIT ASHKENAZI: I take your criticism that this is somewhat academic, a discussion between people who have invested a lot of time in understanding what the real issues are. This has been part of the divide between technologists and lawyers for a long time. The technologists usually don't want the lawyers to meddle in their affairs.

In this context, one concept I want to add is accountability: the requirement that anyone doing these types of operations with information has processes in place to make sure they're doing it in a proper manner. This serves as a framework to analyze these issues.

Data protection, in general, is a complicated issue because it involves values that differ across societies. What we're trying to do here is show that these elements are less controversial and maybe easier to grasp, to make them more accessible to technologists and lawyers. It may prevent the negative results we have sometimes seen, where technologists either don't ask the lawyers because they don't like the answers, or don't ask the lawyers because they're afraid of the law. This is useful in this type of conversation.

>> ANDREW CORMACK: Don't expect yes-no answers in this field. As with everything else, this is about risk management. You're relaxed about risk management when it comes to managing and using computers. Don't insist on: is this compliant or not? Because the answer will almost certainly be: it is not known. We can give you an estimate of how likely you are to get in trouble, on both sides. But one of the things that I found my technical colleagues really struggled and still struggle with is that they come to me and say: can I do this? They expect a yes or no. I come back with a convincing story.

If anybody asks ‑‑ and this is part of the accountability that Amit mentioned ‑‑ we have thought about this, we have looked at what we're doing, and this is how we'll justify it. That, to me, is a much better thing to provide data subjects with than a statement of "we're compliant." Look at compliance with ISO standards and work out just how reassuring those are. I maintain my claim that convincing stories are more useful.

>> ISABEL SKIERKA: Thank you for that practical insight. No yes-or-no answers.

I would like to open up the floor to everyone. Please come forward if you have anything to contribute: questions, arguments to make. In the meantime, feel free to contribute something while people walk up to the mic.

>> Elliot: Hi. I'm Elliot. I'm here with the Ambassador program. I'm from Australia. I'm just wondering if there are any other approaches in national laws? I understand we're talking about the GDPR and Recital 49. Are there other, similar laws for cybersecurity purposes that maybe aren't so based on the GDPR, and are they quite different? Thank you.

>> AMIT ASHKENAZI: So thank you for the question, because it enables recalling that the basis for this conversation, which rests on sharing of information and on trust, is that we need a common language to discuss privacy. As the world becomes more and more covered by privacy laws ‑‑ the GDPR, but also the OECD principles; Israel has a privacy law, Australia has a privacy law ‑‑ we have something common to discuss.

The actual domestic legal solution that gives clarity for these types of situations differs. In Europe, it stems from the GDPR. In the U.S., as Isabel mentioned, it is written into the cybersecurity law. In Israel, we're developing this both under privacy law and under our cybersecurity law. And, basically, you could imagine that maybe the European Data Protection Board or other regulators could one day issue a statement giving guidelines on these issues and thus take the conversation further.

The question, from a technical-legal point of view, of whether it has to be in the law or not is another issue. For an international common basis, having it in the law is more helpful.

Thank you.

>> Mark: Hello. I'm Mark from Microsoft. I'm in the regulatory affairs group. I would like to talk about certainty and risk management.

So, as an example, you're probably aware of the WHOIS identifiers in the domain name system. What we're seeing is that a company like Microsoft can spend a lot of resources and have lawyers working closely with technologists, coming up with, as you say, defensible, credible stories. Our confidence is fairly high that the things we believe we should do are things that can lawfully be done.

In the ICANN environment, the domain name system environment, the controllers of the WHOIS data are not in a similar situation. They are smaller companies. They have less exposure to data processing law in general. They're certainly not applying as many resources to the GDPR as we are. And when we share our interpretations with them, the reaction usually comes in two forms. Civil society says: well, Microsoft obviously has some sort of agenda, and therefore we should disregard their input. And the data controllers say: well, I can't really verify your claims, because I have fewer resources, and if you're wrong, you could withstand the fines and the legal defense whereas we could not.

And I think we've seen this with the Ministry of Justice in the Netherlands, which asked us to review the way we did data processing in Office 365. We worked with them for several months after handing over the DPA. Then they determined we were reliable and that they should be our customer again. I'm sorry, that was a long preamble to the actual question.

Given that, legitimate interests are sort of subjective. So this is the situation: who is performing the balancing test? This is really the trick. I've seen some advice ‑‑ not our advice ‑‑ that really comes down to: if you're comfortable with this, then it is probably lawful. What I'm seeing is that, based on the comfort level of the various data controllers, the responses can be very, very different, to the point where the whole access regime is completely unpredictable.

And I'm wondering how can we better test this law? How can we create concrete cases so people can look at them and say, I believe that my story is credible, not just because I've applied a lot of reasoning, logic, and expensive lawyers to it, or gone to Brussels to ask questions, but simply because there's case law I can point to. How can we move forward in that situation so people can have more confidence?

>> ANDREW CORMACK: I fear the case law ship has sailed because ICANN has already sued one of its own registrars and has lost on that. So there is actually case law pointing the other way.

The problem with WHOIS data is that ICANN has not taken the very strong hints, over 15 years, from the European data protection authorities that they should split functions. There's a pretty easy and obvious case, again based on legitimate interests in security under Recital 49. The EDPB has stated that if the reason for you not reporting a breach was that you didn't have a monitoring function able to detect the breach, that can mean an additional fine. How strong a vote in favor can you want? That's such a long sentence I've forgotten where it was going.

I think there's a strong case ‑‑ yes, I know, the security side should be a no-brainer. I have written an analysis of ‑‑ is it called the Berlin Group? This city has a lot going on ‑‑ what the Berlin Group wrote about this: not for law enforcement, not for rights holders; they can get their own responses. I suspect the best way would be to get something from the EDPB. Again, since they are pushing the security side, it might be possible for them to give an even stronger hint. They've already given very strong hints that they are really pretty relaxed about WHOIS being used for security purposes.

>> Mark: This is also our interpretation. You're proving my point: you're very comfortable with this situation, I am very comfortable with this situation, and yet the data controllers still are not. They require some convincing. I'm wondering how to put that forward.

A side note: I believe there is no longer a clean split, since trademarks and things are good bait for phishing. We find very, very often that our trademarks are being used in phishing scams. The water is clouded. Again, people are saying Microsoft has its own agenda ‑‑

>> ANDREW CORMACK: I would use the phrase "shooting yourself in the foot" if you do that. Stick to WHOIS data in looking at registrations. The problem is you need two entities: one, someone who is willing to do the showing, and, two, some authority that's willing to have a look at the showing. ICANN, unfortunately, has taken out a lot of options for those. I don't have any confidence we're going to get this sorted anytime soon. The problem ‑‑ not just the problem; the problem and the solutions that the regulators have been offering ‑‑ has been ignored for 15 years while we went further and further down the wrong path. Maybe it will take 15 years to get back to the right path again.

But looking at some of the stuff that academics and, increasingly, some of the more advanced response teams are doing with WHOIS data and DNS data, it's becoming one of the best sources. I used to say flow data was the best way to find bad stuff. I'm now seeing papers that convince me that flow data actually finds an event very quickly once it has happened, whereas DNS data can find preparations for an event, and that's really exciting.
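
As a hint of why DNS data can reveal preparations, here is a minimal sketch that flags never-seen domains whose labels look machine-generated, as with domain-generation algorithms. The entropy threshold and the allow-list are illustrative assumptions only:

```python
# A minimal sketch of DNS-based early warning: algorithmically
# generated domains (DGAs) often appear in lookups before anything
# else happens. Threshold and allow-list are illustrative only.
import math
from collections import Counter

KNOWN_DOMAINS = {"example.com", "university.ac.uk"}  # stand-in allow-list

def entropy(s: str) -> float:
    """Shannon entropy of the characters in s, in bits."""
    counts = Counter(s)
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

def looks_suspicious(qname: str) -> bool:
    """Flag never-seen domains whose first label looks machine-generated."""
    domain = qname.lower().rstrip(".")
    if domain in KNOWN_DOMAINS:
        return False
    label = domain.split(".")[0]
    return len(label) >= 12 and entropy(label) > 3.5

print(looks_suspicious("xk3qz9vw72hd1a.example"))  # True: DGA-like label
print(looks_suspicious("university.ac.uk"))        # False: known-good
```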

So at some point, somebody has got to notice that and say, Maybe we need this data. The best DNS stuff doesn't need WHOIS. The researchers have basically given up on WHOIS for a while.

>> I will be short. This discussion exactly exemplifies the kind of discussion we want to have: what social institution do we have that gives us the most clarity? This is one takeaway. I note Microsoft's position on having different players on the playing field, and I don't want to discuss this specific issue, but this is what we want to show you. There's a pressing need for policy makers, because we need to agree socially on what the net positive solution is.

And the last thing Andrew mentioned is an excellent example of why we need this framework to be dynamic. Now we can have an expert opinion saying DNS data is necessary for this type of mitigation; maybe we couldn't have said that several years ago, and this pushes the need to reformulate the balancing exercise. With that, I agree with delineating purposes: you're balancing different interests each time, so one balance doesn't carry over to another. That's my technical note.

>> From North Macedonia. I was wondering, where is the (?) framework? We are not part of the European Union; we are members of the Council of Europe. Where does this fit into the frameworks we have mentioned? Thank you.

>> AMIT ASHKENAZI: So, without getting too technical, I think these high-level concepts can, in general, be embedded within the Council of Europe convention, either through the analysis of what processing is legitimate or maybe in other safeguards that enable processing for the protection of important interests.

So it maybe has different connecting points: Council of Europe Convention 108. And, actually, we took the opportunity here at the IGF to discuss with our Council of Europe colleagues whether they could help in this journey of clarifying and putting out a similar statement. Hopefully, this is something they will consider. They're looking at the way information security is relevant in all of the Council of Europe's documents.

Thank you.

>> I'm from the Forum of Incident Response and Security Teams. I'm happy to see data protection laws acknowledge a need to share information for security purposes. So that's kind of a silver lining on the horizon, but I see dark clouds, and those clouds are called sanctions. We share data among our friends, but we have an increasingly hard time talking even to people who are in someone else's garden, so to speak. There are a number of countries that are not being talked to, and an increasing number of organizations that at first we had to exclude and now can't even talk to. I wonder if we need to go down a path where we say sharing information for security reasons is so important that it should be exempted, like medical or humanitarian aid. I'm exaggerating; I'm not a lawyer, but I would like to hear your opinion on that.

>> ANDREW CORMACK: Don't move again.

Yeah, just to agree: I think the sanctions thing is a mess, particularly because there was a paper and research done just a few years ago basically congratulating the incident response community on being able to work in ways that diplomats can't. That paper was trying to encourage support for ‑‑ they call it science diplomacy; apparently, it's a thing. What I've seen in the last year is actually the reverse: this important stuff has been drawn out of science diplomacy into real diplomacy.

I think we have to find a solution. I don't know what it is.

>> AMIT ASHKENAZI: I mean, we can say this is part of the Internet becoming more relevant to the issues that affect this dialogue.

>> Our job is to ensure that the public has access to government data. The problem we have is an issue of deliberate manipulation. Okay, data should be protected, but it's manipulated politically. We have one of the best laws ‑‑ the Center for Democracy has ranked it one of the best laws. Everyone living on the globe should have access to government data. But politicians in certain government entities consider: okay, we have to protect this data, cybersecurity and this and that, and they do not share it with the public. Then it becomes a big challenge for us to push them to share it with the public, because everyone should have access. It's a human right. Just a comment. Thank you.

>> AMIT ASHKENAZI: I think it's important for the scoping of the issue ‑‑ and thank you for the last comment. It's up to us to look at the infrastructure and the information and let other policy regimes, legal regimes, and institutions in our country deal with the issue of content, if at all. So it's very important for us that we protect the infrastructure and the information in a horizontal manner, whereas we may have other, vertical regimes looking at and discussing the difficult value questions that apply in different contexts: for instance, copyright versus free speech, intellectual property versus free speech, defamation, incitement, et cetera. This makes it easier for us as legal advisors; we focus on the things that make the Internet and the computers work. In that sense, our lives are easier than those of our colleagues in other organizations.

>> ANDREW CORMACK: A quick observation: I think Europe has just set up that problem. In addition to the GDPR, there is a very new ‑‑ I think it's a regulation ‑‑ on the free flow of data, which is supposed to address very much your issue of having access to government data, and the definition of the border between personal data and non-personal data in it is very unclear. So I think we're running rapidly into the problem that somebody will have to sit down and say: well, actually, at what point does something become personal data, where the GDPR-type regime of confidentiality and strong controls applies, as against the open government regime?

The other area I'm hitting personally, because I work in research, is the big push to open up research data, trying to make the raw data available for reproducibility. That opens up the question of where that data becomes personal data. I think there's going to be an issue in Europe because the research parts of the GDPR are largely left to individual member states. So I agree with you that there's a problem. I think a lot of countries and regions are going to need to look at it soon.

>> Yeah, if I may: even when it comes to personal data ‑‑ some of the officials, in the span of one or two years, you know they didn't have a penny, and all of a sudden they become millionaires. It's a human right to know that he had nothing and, all of a sudden, he became rich. Some of that data is personal, but the person is a government official, and when it's a government official, the data should be shared. It's an issue of corruption in the government entities, and that's why it's manipulated. They're protecting the organization, certain data. And even in security, according to our law, there are certain things that cannot be shared: movement of soldiers, et cetera.

But what you procure for the soldiers ‑‑ there should be proactive disclosure, so people know what they're doing. That's the concern there.

Thank you.

>> ANDREW CORMACK: Again, there are big differences ‑‑ the cultural thing. In the Nordic countries, I think the tax that public officials pay is regarded as public ‑‑ I'm getting a couple of nods ‑‑ and salaries, okay. Whereas in the UK: what my salary is? No way am I going to disclose that. Even between countries that we regard as very similar ‑‑ well, I do, despite our recent issues ‑‑ there's a very big cultural difference in how pay for a job is considered. We had a big scandal over expenses for members of parliament several years ago, to which some people now trace distrust in politics.

>> ISABEL SKIERKA: Thank you. I think that was another fascinating point, and I think there is also some discussion of this topic at the IGF. I believe there's a dynamic coalition on publicness; that may relate a little to what you just pointed to.

Since we don't have much time left, I would like to move the debate back to the core of our discussion, which pertains very much to the exchange of security information, threat intelligence, and so on. I think we've made some progress here in the discussion, talking about certain approaches we might take to move this issue forward and to find some agreement beyond the European Union framework or the U.S. framework. We've talked about the Council of Europe, with whom you have already had discussions. So I would like to ask you again ‑‑ and also all participants in the room ‑‑ what would be your concrete suggestions, maybe steps forward, both practically and on the conceptual level: the legal level, the technical level?

>> AMIT ASHKENAZI: So I think we need to embrace the uncertainty we're seeing. There was a great report discussed this week at the IGF by the Internet & Jurisdiction team. It talked about jurisdictional challenges on the Internet as one of the next big things that will affect us. In this area, this may be too dangerous for our cooperation in cybersecurity. So we need to embrace the differences and continue this dialogue, which is a domestic dialogue between security experts and lawyers, but it needs to have connecting points internationally.

So this was the point of having this discussion here, because we hope to show that when you analyze the question according to recognized principles, you can reach rational solutions that can support cyber defenders. I would suggest to countries that are developing their data protection or cybersecurity legislation to embrace this issue, look at the global best practices, and try to make their laws interoperable. They don't have to reach the same value judgments that maybe the EU has reached, or that Israel or the U.S. is reaching, because this is local.

If there's a common denominator between jurisdictions, then this is a useful thing for cooperation. I take very seriously the question from the audience member from Microsoft about the level of legal certainty, how we should get it, and which institution is best placed to provide it. This is also country-specific. Having done technology law for a long time, I think the law should be more open-ended, and there should be institutions that can dynamically interpret it; but I take into account that someone, as Microsoft noted, may think this is subjective. So this is a challenge we need to deal with. This is what I suggest.

Thank you.

>> So maybe I will just add that the tech community is sharing information all the time. They have their own rules to make sure the other side that receives information preserves it in a way that suits them; the TLP, the Traffic Light Protocol, is an example. We, as international lawyers giving advice to governments, need to think about how we cooperate with other governments, and as we create the legal framework for this cooperation to continue, we need to make sure we are respecting the rules and the language that the tech community has already created and adding another layer. If we understand that on the other side there's a similar legal framework, for example on privacy, it could promote cooperation with the tech community and also help us reach agreements between governments.

>> ANDREW CORMACK: If I can make a really concrete proposal ‑‑ and it may look deeply scary ‑‑ get a lot of coffee, get your legal and technical teams in a room for half a day, and do a data protection impact assessment. You will learn a lot about what each other does. I was in a privileged position because I have done both. Our data protection officer's head exploded; some of the techies' heads exploded as well. But we ended up with a document that reassures us that we're doing stuff right and safely. It reassures our customers, and it reassures others that they can share data with us and that we will use it in the spirit in which they intended.

>> ISABEL SKIERKA: If I may, I'm wondering ‑‑ I think there is a big issue here about capacity building as well, and about information sharing: not technical information sharing, but sharing information among the communities. For example, I wonder whether FIRST, the Forum of Incident Response and Security Teams, has set up a working group in this space, or whether there's an international forum where you can do that, where there are platforms for sharing this information. I think those would also be concrete steps.

>> Our legal advisor is actually sitting on the panel ‑‑ he doesn't know he's the official legal advisor. We're happy to be supported so much. We come up with frameworks that allow us to automatically share data in a legal way. The amount of data is growing at an enormous rate, and we're pretty good at sharing person to person, but that doesn't scale. So we have a special interest group that works on a standard that allows machines to share, where the appropriate sharing rights and all this stuff are implemented. Having said that, I do feel that the tech community needs to reach out more to the policy-maker community and the legal community. A lot of members of the tech community are very reluctant to talk to anyone who speaks a different language, and that's a challenge we're working on.
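
A minimal sketch of what machine-to-machine sharing with embedded share rights can look like, using the TLP-style markings mentioned earlier. The enforcement rules below are a simplified illustration, not the text of TLP or of any FIRST standard:

```python
# A minimal sketch of marking-based "share rights" enforcement for an
# automated threat-intelligence feed. Simplified illustration only,
# not the actual TLP definitions or a FIRST standard.
TLP_ORDER = {"WHITE": 0, "GREEN": 1, "AMBER": 2, "RED": 3}

def may_redistribute(marking: str, recipient_scope: str) -> bool:
    """Decide whether an automated feed may forward an indicator.
    recipient_scope: 'public', 'community', or 'need_to_know'."""
    scope_ceiling = {"public": "WHITE", "community": "GREEN",
                     "need_to_know": "AMBER"}
    # Treat RED as person-to-person only: never machine-redistributed here.
    if marking == "RED":
        return False
    return TLP_ORDER[marking] <= TLP_ORDER[scope_ceiling[recipient_scope]]

assert may_redistribute("GREEN", "community")   # within the ceiling
assert not may_redistribute("AMBER", "public")  # too sensitive to publish
```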

>> ISABEL SKIERKA: Thank you so much. Does any one of you want to have a last word about this?

So I think we have a few seconds left. The one thing I would take away ‑‑ oh, yeah. Really fast, please.

>> ALEJANDRO: Thank you. My name is Alejandro. I'm from the University of Mexico. Pleased to meet you in person.

I will briefly bring in something I've also mentioned in other panels. The rules we are dealing with ‑‑ the ones you are now struggling so hard with, especially in the CSIRT community ‑‑ originate from different regimes, and we're not plugged into those. We have the multistakeholder regime. We have the multilateral regime, where rules like the GDPR come from. We have the GGE and so forth. We have to get the technical community engaged so that these rules are implementable.

>> ISABEL SKIERKA: I'm getting signs that we absolutely have to wrap up. Thank you so much for joining this panel. Join me in applauding our panelists here.

I think one takeaway is that there's still a lot left to be done. I think this is one of the first sessions that actually addressed this issue practically here at the IGF. Maybe all of you could connect informally and see where and in which forum you can continue this discussion.

Thank you.