IGF 2017 - DAY 2 - ROOM XXIV - WS123 - Internet of Things and Cyber Security


The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> ARTHUR RIZER:   Okay.  So, we're going to get started.  Sorry we're a little late.  The panel before us went a little late.

So, my name is Arthur Rizer, Director for National Security and Criminal Justice Policy at the R Street Institute, and for full disclosure, I think as a moderator I should disclose kind of my perspective and where I come from.

We are a libertarian think-tank, and generally side with the free market.  At the same time, I also served 20 years in the U.S. Army, so I have concerns, when you start talking about the Internet of Things and cybersecurity, about things that affect more than just, you know, the cell phone in your pocket.  We have to be more aware and more concerned about what we are doing.

I am not going to get into a long discussion early on about, you know, the outline of this panel.  But I will say this is not actually a panel like you are normally used to going to.  Maybe.  This is actually a debate.  With that said, you know, this might be a little more intense than some of the other things you've been to; specifically, the panelists are encouraged to go back and forth and ask questions of each other, which is not something you normally would see on a panel.

So, you know, the massive development of networked devices, ranging from cell phones to your toaster, has really changed the way we think about the Internet of Things over the last couple of years, and they have been deployed en masse across markets, across the entire world.

So, when we're talking about toasters and even watches, and even sex toys, yes, sex toys are connected to the Internet now, we have to start thinking about our way of life.  You know, protection against criminals.  Privacy interests.  Privacy against the government.  Privacy against our neighbors.  And, Consumer Protection, ensuring the devices that we have act in a way that we expect them to.  But, as we move forward, we're starting to see very clearly that we also have situations where our very lives are on the line.  If you break into a camera, or you hack into a camera on a refrigerator, your privacy is violated.  If you break into the braking system on a car that is automated, your very life can be in danger.  So, we have to have new language and new ways of addressing these risks.

So, today our debate aims to explore the options available.  I hope this doesn't end up being a debate where we vote at the end of the day and say I'm for regulation or I'm not for regulation, but instead an exchange of options and questions, ranging from consumer trust to user-centric security approaches.

Technical issues related to cybersecurity, the big business of the Internet of Things, Privacy Shield.  I haven't been to an international meeting where somebody doesn't ask me a question about Privacy Shield.  Political issues, the law and economics related to the high-tech sector.

So, with that, I'll ask each of the individuals up here with me to introduce themselves very briefly.  Their bios are online so you can read them, but very briefly introduce themselves and add to the definition that I have laid out.  We'll get into the meat of the debate as we move forward, but is there anything from the definition that we could add to?  And we can start with Arthur van der Wees.

>> ARTHUR van der WEES:  Thanks, Arthur.

>> ARTHUR RIZER:   That's a great name, by the way. 

>> ARTHUR van der WEES:   Yeah, it's great.  Easy to --
Hi everybody, my name is Arthur van der Wees.  I'm a lawyer from Amsterdam, with a global practice around the world, actually on all continents, and I want to indeed mention a couple of notions here to you.

One: build fast, fix later.  That is the current motto and the principle that is used.  We build very fast because we want to be the first on the market, and we say, well, you know, we can do security later, and privacy and data protection and data management; you know, it is the wild, wild west anyway, so we will see later what happens, and perhaps the regulators will come later in five to ten years, and we're able to push it off and say it's not yet clear what is happening.  So, that is my first notion here.

The second is that actually we cannot wait, we are too late already, but you are never too late to start all over again, while also taking into account legacy, of course.  And, we need to, of course, take into consideration what we already have.

One of the things is to note that no market, no society is static, so forget about static.  Anybody that says we need to wait and (Audio disconnection). 

>> MILTON MUELLER:   Talking about liability, then we could maybe have a productive conversation about the role of liability in the Internet of Things and how far that should go, recalling that we had this conversation about software liability ten, 15, 20 years ago, and the software companies all freaked out and said we are going to die if you impose liability on us for software problems.  So, I guess that is my bottom line point to start with.  If you invoke regulation, don't make it a God that will just do the right thing because you're appealing to it.  I apologize if anybody is religious here.  But actually, define what you mean by regulation; tell me how it is going to work institutionally, legally, and politically.

>> ARTHUR RIZER:   Great.  I think that rounds out what we're trying to define and what we're going to debate.  Let's kind of get started.

In my readings and my study, I have learned that many of the experts that actually argue for regulation don't typically come from among the experts in political economy or regulatory institutions.  Most often they come from technical experts who aren't familiar with the challenges that it takes to regulate something, to actually promulgate the law.  In essence, what I'm saying is they don't know how the sausage actually gets made.

Dr. Tropina, right?

>> TATIANA TROPINA:  My name is Tatiana Tropina, and I'm from the Max Planck Institute for Foreign and International Criminal Law in Freiburg, Germany.  I would like to build on a point Milton made, because as a lawyer, theoretically, I could be for regulation, but I see some elements lacking here, and as Arthur just mentioned, I didn't see many political economists or lawyers advocating for regulation of IoT.  Why?

Not only because we don't know whether it would be a national or international regulatory agency.  For me it is not clear who the subjects are.  What are we regulating?  Because, when we are talking about the Internet of Things, it could be anything from self-driving cars to toys to (?) and whatever.  So, normally when we are thinking regulation, we are thinking industry, right.  Sometimes cross-industry.  But this regulation would be all over the place.  So, define an object and a subject.  Otherwise, I think this is going nowhere.

Then, the question is, well, what exactly are we protecting with this regulation?  Are we protecting the safety of consumers or are we protecting networks?  Because, on the national level, if we care about our consumers, we can regulate some things; for example, in some countries we prohibit the export or import of toys which are dangerous, you know, for health.  We can do the same.  But, what if I smuggle or just bring in a device which is not safe, something small, the size of a button?  And here comes my next point: the enforceability of all this.

So, if we are protecting consumers, how exactly are we going to enforce it cross-border?  And if we are protecting consumers, how are we protecting the networks?  Because regulating some IoT devices in Country A will not save the network itself, because it is interconnected.  The effect from Country B, where it is not regulated at all, can be the same danger.

And, I agree with Milton that if we talk about liability, I'm ready to talk about this one.  I think this is the point where we can talk about damages, about liability, and so on.  But, again, as a lawyer, I will not call it regulation.

And, I know that this position might look a bit weak from the point of view of, oh my God, well, you have so many questions but you're not offering the solution.  I can offer some solutions apart from liability and damages.  I think that we have to make the problem focused.  Like, for example, there are many people who make analogies with the auto industry: we can regulate the cars, why can't we regulate the Internet of Things.  Okay.  So, if we define the subjects, we show the most dangerous, self-driving cars, and we say we're going to regulate these particular vendors, or producers, or whatever, this would be maybe enforceable.  But, then we have to talk sectors, we have to talk industries, we have to talk maybe even national borders, but we cannot talk about the Internet of Things as a whole.  So, maybe targeted solutions.  Targeted regulatory solutions, targeted and imposed in the systems where you can ensure that you enforce them.  This might be a solution.  But, not all this overarching concept of regulation, which is just waving a hand, because it doesn't get us anywhere.

Thank you.

>> ARTHUR RIZER:   So, Professor O'Donohue, we've heard a lot about this cross talk among individuals in this area.  You know, as, you know, a government official, how is the government thinking about these things?  And, you know, very specifically to Dr. Tropina's statement, this moves fast; how is the government going to keep up?  Because, listen, we all know that cybersecurity changes as fast as somebody's fingers can type code, but that is not the way the government works.  The U.S. was designed to work slowly, not so much in Europe, but that was part of our system.  So, how is the government thinking about, one, keeping up, and how are they thinking about trying to tackle this cross talk of defining regulation in an appropriate way that can get everybody to the table?

>> PEARSE O'DONOHUE:  Thanks.  Well, I was appointed Director last week, and you just gave me a Professorship as well (Laughter).  This has been --


>> ARTHUR RIZER:  I meant to say director. 

>> PEARSE O'DONOHUE:  I am with a team on future networks.  It is interesting.  We work on the Internet of Things, free flow of data, and Internet Governance, because we see that all of these issues need to be dealt with together.  There is another team in the department in which I work which works on cybersecurity, and I will tell you later what they do.

I was (?) what was already said because I think it was really important.  We can always nitpick the title of the session, but it does help us to identify already what we're talking about, because the question is what sort of regulation is actually in mind.

I would say if we tried to go for the standard mandated harmonized requirements, which require conformity assessments, which require real-time regulatory overview, we're more or less wasting our time for reasons that have already been given.  We're trying to swat flies, except that there are going to be millions of flies, and some of these are bees and hornets and some of them are wasps, and we're likely to get very badly stung.  We have to find another way of dealing with this.

So, we have to think about moving away from the normative type of regulation to things that would be more incentive-based and might lead to a result, and also things that can frame and structure the market and the way companies behave.

And, secondly, when we do find what is probably a balance of things, we have to stay very much at a level of principles, and I fully agree with Tatiana, we have to look at specific sectors and specific applications for more of the detailed regulations, where that would be appropriate.

If we do anything more than principles at that first level of cyber regulation, well, as has already been said, it's not state of the art anymore, it's already history.  So, we will always be missing our target.  Again, we'll be swatting the flies and missing.

So, for us what is really important is that we work not just with industry, but with independent researchers in academia.  So, the work we do starts with research, and then implementation.  But then we also work on what we would call regulatory policy.  How do we do this framing and structuring of the markets as they move so rapidly?  Because, in fact, part of our work is based on an inherent contradiction; that is, that we need to develop and actually support the take-up of these technologies and services for the good of society and for the good of the economy, but we also have a mandate, which is equally strong, which is to protect the individual, to protect the rights, the dignity, and of course the privacy of individuals, as well as the well-being of society.  So, there is that security mandate which comes back into it, and the only way that we can actually do that juggling act is of course working with Governments, which is what you asked me about, Arthur, but we have to work with all of the stakeholders in this room so that we don't have regulatory policy which suits the services or the manufacturers but doesn't suit the individual and the citizen.  And, as has already been stated as well, if we only think about the citizen, we can, of course, just ban things and stop things, but is that actually good for the individual, and certainly is it good for society in the medium to long term?  So, these are some of the issues that we're dealing with when we seek to define the problem.

>> ARTHUR RIZER:  Director Botterman, I read some of your work and this idea of the moving target within the cybersecurity world, that the bureaucratic process is just not set up to handle this.  How do we deal with the rapid technical change in this area, and, you know, kind of adding on to what we've already talked about, how do we deal with the problem of the moving target?

>> MAARTEN BOTTERMAN:  Thank you, Arthur.  Yeah.  I am a Director on the ICANN board but will not speak in that capacity.  I have done a lot of work on IoT in collaboration with multiple stakeholders as chair of the Dynamic Coalition on the Internet of Things.  And, that's really looking at good practice on the global level.  How does that work?

If you look at the main title of the session, regulation or not, it's so important to consider that also at a multistakeholder level; regulation has a role, but that role is not sufficient in such fast-changing times.  So, if you talk about the rapid development and how to keep up with that, I think we need to take responsibility as stakeholders together in solving this problem and making sure that we develop a world we want.

Now, I think a very clear example of where it didn't work is actually data protection, where we see that we were all aware that privacy was an issue that wasn't always addressed well, and both government agencies, commercial agencies, and even end users were not very careful about it, not very (?), at a time when more and more of our life will be digitized, and it thus becomes more important that no abuse of this data is made.  And, it wasn't until GDPR that the world woke up and really looked at what we need to do to be GDPR compliant.  Well, I would hope that in cybersecurity we don't get into a similar situation, where the pressure of dealing with it is postponed because of everyday economics to a point where the costs of really implementing it will be tremendously high, and to a point where regulation will force us to either apply a certain level of security or else, like the GDPR.

So, in that way, we are very much looking at the responsibility for dealing well with IoT on different levels, which includes the end user, but it should be clear that you don't expect the end user to do a part that it can't do.  And, there I would like to make a slight comparison to health care.  We cannot make our general practitioner responsible for our health alone.  We need to do something there, too.  I think with responsibility in IoT environments, it's the same thing.

So, we all have a role.  Regulation is important, and I agree with the other Arthur, principle-based is unavoidable, harms-based (?) maybe continuing to bridge the gaps between the current European thinking and thinking elsewhere in the world, but for sure it will require all of us to take action.  It will require industry to move from time-to-market towards standing out as an excellent, responsible producer of tools and of services.

>> ARTHUR RIZER:  Great.  Thank you very much.  Let me ask a question to you, Professor Mueller.

When I grew up in the 70's, I remember getting thrown in the backseat of a car and you just said a little prayer if you were going to make it to the movies or to get popcorn, because we didn't even know what a seat belt was.  And, the market didn't do a really good job, because there were studies that were done, and people, when they bought cars, at least in the United States, didn't care about seatbelts, despite all of the evidence that they made people safer.


The government was proactive, and they were ahead of the curve, and they ensured that seatbelts were part of standard practice within the community.

I wrote an article recently in Wired magazine where I actually chose teledildonics as a hook, because I thought people would read it if I was talking about sex toys, and hey, they read it, but I was amazed that people didn't care that much about their privacy.  The consumer just didn't care that much.  It really blew me away.  And, I'll say that because, as a libertarian, I believe that markets can fix everything, and this kind of made me question that religion that I have.

So, you know, now we strap our kids into the backseat of cars like they're getting into a fighter jet, but when I was a kid, we didn't do that.  The markets didn't fix it.  Those two forces are coming at each other; how do we come to grips with it?  Lots of questions.

>> MILTON MUELLER:  Yes.  So, in terms of when consumer choices don't actually lead to a socially desired result, there are forms of analysis that you can use that distinguish fairly clearly those situations from other situations.  And, I think that in the case of automobiles, you had a very well-established product, a very well-established industry, and one thing not to overlook in terms of the role of the market is that the cost of implementing these regulations continually fell as technology improved and the competitive forces in the automobile market intensified.  So, you had, you know, I guess the purely ideological response to your question would be, well, if people want to take that risk, let them take that risk.

And, the solution that most societies have adopted is to say that, partly because of social cost and externalities, we're not going to let you take that risk; we're going to end up paying your medical bills, and you're going to possibly crash into somebody else, and so on, so we've installed lots of safety devices by regulatory mandate.  And, it has had the desired effect, there is no question about it.

At the same time, a lot of the new technologies are truly market driven.  You know, we just got a new car, a Subaru, that beeps when you go out of the lane, and it has the automatic braking; this was a factor in our decision to buy this car.  I think a lot of people like that.  Pretty much it becomes standard equipment after a while, for more and more people.  So, the move towards auto safety, I think, has been a combination of government mandates and market incentives, and it would be wrong to emphasize one over the other.

>> ARTHUR RIZER:  Mr. Van der Wees, how do you respond to that?  There is the ideological response, that people know the risk and they take the risk and government should stay out.  At the same time, the market eventually does catch up, which I think is generally true, but there is a gap where we have loss of life while the market is catching up.  How do you respond to that?

>> ARTHUR VAN DER WEES:  We have an overdependence on interdependent systems and products.  Digital used to be fun, used to be nice to have.  Now it's a need to have.  So, you cannot choose between analog or digital.  With IoT it is going to be the same, whether it is a connected car or drones or whatever there is.  I mean, you can choose, of course, between connected and whatever for a home appliance, but you cannot buy a normal TV anymore; it's smart anyway.  I don't think it is the free choice of the people, of society, actually, to choose.  I think it is chosen by industry, because they make more margin; they can ask more money for that appliance than for another.  So, that is one.  It is not a nice to have, it's a need to have.  And, therefore it is a societal issue.

The other one is that we did a workshop earlier this year in Brussels together with the European Commission and the alliance of (?), where industry is also involved, and we did a session with four breakouts on security in IoT.  So, which principles, which minimum baseline requirements would you like to see in your vertical?  And, we did smart cities, we did appliances, like wearables, as well as some industry 4.0, and we did autonomous vehicles.  And, after two hours they came back to the plenary, and they found out that they came up with 30 principles that are all applicable to all these sectors.  So, this is not always a purely sectorial issue.  We can, I think, to a very large extent get to a minimum baseline on these principles.  So, perhaps anybody knows the Red Flag Act.  The Red Flag Act was way back when the first car was on the road.  The car was not allowed to drive except if there was a human being in front of the car with a red flag.  So, that is, of course, not a regulation we're looking for anymore.  But, indeed, the seat belt is a good example; did the consumer actually ask for it?  I don't think so.

And also, here, you see that we are either waiting for things to go wrong, or we can anticipate what will go wrong.  IoT is nothing new.  It's a combination and a convergence of technology already on the market.  So, again, saying this is so advanced and so new, I totally do not agree with that.

The drone, the autonomous drone, or any semi-autonomous drone, is already highly regulated, because apparently markets are regulating, and member states and also states think it's something that they need to regulate.  So, also there you see that they do step in sometimes.  You're not allowed to play with your drone in a city, but you can of course use it for smart agriculture and for all kinds of other stuff.

The main point that we're missing is that privacy regulation in the European Union we have already had since 1995, but nobody really did too much with it.  Why?  Because the enforcement level, the big stick to hit people with, was too low.  The maximum fine is 150,000 Euros; for high-scale providers and the big titans of the industry, that is pocket money.  Now we are indeed getting, as Maarten already mentioned, GDPR; we get more muscle, and apparently we need it.  I'm totally with you on liability and enforcement, but that does not cover it all, to wait and see what happens until there are some dead on the ground and then see whether we can enforce.  I don't think that society, that people, are looking for that.  So, again, I don't think there is a free choice.  Secondly, I think people in societies, including member states, organizations, and companies, want to have control over their own stuff.

>> ARTHUR RIZER:   Dr. Tropina, would you like to respond?

>> TATIANA TROPINA:  I would like to respond.  Whenever I go to these debates, I remember how I got a couple of gray hairs in Mexico last year, because every Uber I took, they didn't have seatbelts.  They just didn't.  The same at the ICANN meeting in Morocco.  In the taxi, there were no seatbelts.  They were broken.  We were driving without the seatbelts.  And, while I cannot bring this car to Europe and drive it here, because I will get stopped, you know, by the police at the first turn, I still can buy a sex toy from that Country, or any small device, and bring it here, and it will be unsafe, and it will be my consumer choice.  And, no one can prohibit me from this, and this is where, you know, we are missing the point, and the point is the enforceability of this regulation, because the car is visible, and if I'm missing a seat belt I will be stopped, but in this case it creates a very complex system when you impose the rules.  And, as Maarten said, industry has to start acting responsibly.  Yes, maybe, but to me it sounds a bit like rainbows and butterflies that industry will start acting responsibly and the consumer will go for safer choices.  The consumer will go for cheaper choices.  And, as we know, for many sectors, industry regulation started being you scratch my back, I scratch your back.  There is lots of lobbying.  You know, if you can avoid regulation some way, you will probably avoid it, because there is always the opportunity.  So, for me the big issue here is, first of all, the enforceability, and secondly, how do you enforce this cross-border?  While I do believe that in Europe something might be done, and in Europe we do believe in regulation, it all just crashes once we think about the cross-border environment.

>> ARTHUR RIZER:  Director O'Donohue, you seem to want to --
>> PEARSE O'DONOHUE:  I want to disagree with you.

>> ARTHUR RIZER:  That is why you're here.  That Arthur, not this Arthur.

>> PEARSE O'DONOHUE:  I don't think we can draw complete parallels to the past.  There is something new in IoT, because it is ubiquitous, it is all around us, and because the unit cost of an IoT device is plummeting through the floor.  So, of course, that means people, given that choice of cheap equipment, are more likely to take it; but of course, as the industries are given the opportunity to drive down cost, the unit value of an IoT sensor or device is so low that to add the equivalent of a seat belt actually completely changes the business model.  When seatbelts had to be introduced into cars they were expensive, but they were not all that expensive compared to the unit price of that complex product.

In the same way, that is why the automotive industry is looking at adding lots of IoT, lots of sensors and wireless devices, to the automobile, because it's an expensive platform, and they can actually have some market differentiators.  And, they can also capture data.  But, for a simple IoT sensor, which is worth $2.00 or two Euros, you know, you have to have a very strong incentive to put in some safety, to build safety into that.  So, that is where I disagree with Arthur.  And, Tatiana, let me put it this way.  I agree with Maarten and nobody else (Laughter), because in fact, we have to do a number of things.  Industry has to be forced to face its responsibilities.  They need to have at least the threat or the backup of regulation if they're not doing it, but of course we also have to work and find ways to strongly encourage the user, the citizen, to actually adopt in this case what we, well, we didn't have the phrase before, but cyber hygiene.

So, what was it that led the user of the automobile, Arthur, to actually put on the seat belt once the seat belt had to be built in?  There had to be further education.  In some cases, in member states in Europe, it actually became obligatory.  First of all, you had to wear seatbelts in the front row, the front seats, so the kids didn't seem to matter, and then the backseat.  So, we can certainly look at these examples from the past, but there are sufficiently new features in these new technologies, particularly the Internet of Things, that we actually have to look forward and think again about what this environment is.

>> ARTHUR RIZER:  Arthur, and then Tatiana, if you could --
>> ARTHUR VAN DER WEES:  Yes.  My point is that the industry is moving, or saying we need to wait and see, as mentioned already.  Another very new thing, and that is something missing so far, and I want to raise this, is that it's not about the vendor and customer only.  The vendor may give you a home automation system where you can actually change the password, so there are no fixed credentials; well, good luck finding them, but they are on the market already.  And then the user, the consumer, doesn't care about changing those passwords; it defaults to something predictable.  And who is then the victim?  It's society.  Because with these hacks, you very easily hack them, and you can very easily do an IoT-enabled hack on (?) any industry, any organization, any company, including member states and states, and for instance this organization.  So, it's not only the user that is the victim here; that means it is not a contractual thing you can take care of, it is the (?) user that we need to protect.  And, I think, of course, being a lawyer, I love to sue people, I can try to sue both the vendor and all the people that forgot to put their credentials in order in the camera; that is going to be a different exercise.  Probably find them anyway, but I don't think that is the way I want to go forward.  So, there is a mutual responsibility, and the word which I would like to hear, and what we see much more, is accountability.  It's not only all the way downstream, liability and who is responsible and who has to pay; the accountability part is based on the principles.  Accountability is also in the GDPR, and it depends on the context, depends on the risk impact of what you are doing.  But, actually, a very cheap device or sensor, as Pearse already mentioned, can be cheap and have huge impact.  Even though the cost can be one Euro or less, the impact can be enormous, and that is another thing that is on the market.

>> ARTHUR RIZER:  Dr. Tropina. 

>> TATIANA TROPINA:   Tatiana Tropina, for the record.

I love all the "industry has to be."  First of all, I like it because I have no idea what industry we are talking about.  Are we talking about the automotive industry?  Are we talking about producers of (?), are we talking about children's toys, video cameras?  Which industry are we talking about?  Basically, we are talking about everything.

And then we are saying it has to be so-and-so, and by the way, we have to encourage them.  So, are we going to make it "has to be regulated," or whatever, or are we going to encourage them?  And, if we are going to regulate them, I'm coming back to my very first point: which sector?  Which industry, and how?  And, secondly, again, about responsible consumers, seatbelts and whatever: in Europe, I would believe, we can implement some rules for industries where collateral damage might be very high, from a Consumer Protection point of view.  I can even see how, at the European Commission or on the national level, we can establish at least some rules; not from a network security protection point of view, because here I do have difficulties seeing how it intertwines with cybersecurity, but from a consumer safeguard point of view.  But then again, A, where is all this produced, and what is the industry; and secondly, how can we ensure, in the cross-border environment, that my rules, that I have to fasten a seat belt here, would be met once I travel to Mexico and the Uber driver is arriving without a seat belt, and what am I going to say?  You know, think about this analogy for the Internet of Things and for the unsafe devices.  Even if I as a consumer want to have a safe device, you know, when I open the tap I want to be sure; I don't think about what kind of water I'm getting there.  I'm sure that it's safe.  But I'm not sure that this would be the case with any device in the Internet of Things, because I have no idea where it is produced, what kind of industry you're going to regulate, and whatever, in this cross-border thing.

Thank you. 

>> ARTHUR RIZER:  Professor Mueller. 

>> MILTON MUELLER:  I think, as Tatiana pointed out, the pro-regulation camp is kind of squishy on the regulation issue: it is not clear what they're advocating, or whether some of the things they're advocating are actually regulation at all, which is fine.  But I want to deal with a higher-level philosophical issue regarding the call for regulation, and that is this argument that individuals making choices in the marketplace are completely ignorant and lack any real choice, yet when we make choices through our political agencies we are suddenly rational, we suddenly know exactly what we're doing, and we never make mistakes.

I think that premise has to be challenged here.  I think Pearse made a good point about cost.  When you're talking economics, you have to be very focused on this.  If the cost of an IoT seat belt, whatever that metaphor means, is three times the cost of the IoT device, whereas in the automobile the seat belt is perhaps a hundredth of the cost, you are dealing with a fundamentally different situation in terms of the rationality of the requirement.  So, you could actually destroy the entire industry by imposing certain kinds of regulations based on that model.

Same thing with the botnet.  We know that the Mirai botnet worked because the devices had these default passwords, and that was the basis of the botnet.  But people learned from that, and why haven't we seen another one like that?  Why hasn't that happened again?  It is because a lot of changes were made in the industry, and the people who ran the Mirai botnet were just arrested.  So, again, you can't ignore the fact that the market and the actors are learning, and that centralized regulation might not be necessary, at least in the short term.
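The default-password weakness mentioned here can be illustrated with a minimal sketch.  This is not Mirai's actual code; the credential list and the toy "device" below are hypothetical examples of the kind of well-known factory defaults such malware scanned for:

```python
# Sketch: how scanning devices for factory-default credentials works.
# The pairs below are illustrative examples, not a real exploit list.
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("root", "root"),
    ("root", "12345"),
    ("user", "user"),
]

def find_default_login(try_login):
    """Return the first default pair accepted by `try_login`, else None.

    `try_login(user, password)` stands in for a real Telnet/SSH login
    attempt against a networked device.
    """
    for user, password in DEFAULT_CREDENTIALS:
        if try_login(user, password):
            return (user, password)
    return None

# A toy "device" that still has its factory password set is found instantly:
vulnerable = lambda u, p: (u, p) == ("root", "12345")
print(find_default_login(vulnerable))   # prints ('root', '12345')
```

Any device whose owner changed the password makes `find_default_login` return `None`, which is why the post-Mirai industry changes Mueller refers to (unique per-device passwords, forced first-boot password changes) blunt this class of attack.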

>> ARTHUR RIZER:  Director Botterman, do you want to respond real quick?

>> MAARTEN BOTTERMAN:  It is Pearse who is the Director.


>> ARTHUR RIZER:  You're listed as Director on my page.

>> MAARTEN BOTTERMAN:  Thank you.

Just to emphasize: to make all this work, what we also need is some transparency that doesn't exist yet.  Where do I go for information about my IoT devices, my IoT services?  Where do I find reliable feedback on that?

The second thing, also referring to the seat belt cost issue that Milton raised, is: how sensitive are certain IoT equipment and certain IoT applications with respect to privacy, to security, and to safety, consumer safety?  We too often still speak about IoT as just IoT, whereas it covers so many different things.  I think we can mutually advance the discussion by really looking at different levels of sensitivity.  So, it would be great to have that transparency exposed, either by Civil Society or by government, at least by those actors that don't have an interest in hiding anything.  In the long term nobody has such an interest, but in the short term some do.

Secondly, if they're interested in some kind of certification or (?) scheme, it would be great if government would confirm it: it had better be true, otherwise we will correct you on that.

>> PEARSE O'DONOHUE:  Thank you.  If I may, then.

>> ARTHUR RIZER:  Yes, sir.

>> PEARSE O'DONOHUE:  Particularly after what Maarten has said, because we talk a lot about what the regulation might be, but also what the flaws in regulation would be, let me talk about what the European Commission actually proposed in September: our cybersecurity framework.  First of all, we did not propose detailed legislation; we did not propose detailed rules or mandatory normative rules.

What we did was try to create a framework which has a number of components, including strengthening an existing European cybersecurity (?), but also creating competence centers and bringing them together, which are the technical go-to people with regard to cyber threats, particularly in IoT and other new online services.  And then the third element was that we seek to develop, again with stakeholders, a certification scheme.  And we should say certification schemes, plural, because it's important here, and Tatiana has been making this point clearly, and she is quite right: it is not one size fits all.  There are different industries, radically different processes, and, to a certain extent, radically different technologies.  So, any granularity or detail has to be at that level, if it is to work at all.

But what we need, again, is to create a framework that supports and sets at least the outer parameters within which we can then give confidence to the user and also some guidance to industry.

So, that set of certification schemes would look at what the global, or European, hopefully global, standards are, where they exist, and also, of course, use the number of certification schemes that already exist, but to create clarity: does the security standard in this device serve my needs, my security requirements?  Eventually, and we have not committed to this, because in itself it could become unworkable, but listening to what Maarten was saying, we might go further and, again for more specific devices, look at labeling schemes in the future.  We're not going to talk about sex toys now, but I am going to talk about toys, where the European Union has a very clear and respected labeling system for the safety of toys.  Consumer-facing sensitive sectors like that are areas where we should see how we can adapt that to the IoT.  So, different schemes, obviously different standards, and all of that would then give rise to a more informed liability regime, which Milton has referred to, which may be different depending on the sector and technology.

Quite frankly, here, just as we do in the GDPR, we are going to have to have some sanction.  Even if, and Tatiana is pressing on this, even if we're not regulating the industry because we don't know how it would work, we do know that market forces will not deliver unless there is a baseline, unless there is a rule that says: you must make this product safe.  What is safe in a digital world is yet to be defined, but if we have a duty of care, if we have standards that are identified as being applicable in a given area, we can also follow up with a liability regime which will punish those who do not follow through.  That is the basis of the Commission's cybersecurity package we have designed.  A lot of work to do, and of course a lot of that with the stakeholders.

>> ARTHUR RIZER:  I'm going to ask one last question to these panelists, and these questions are loaded on purpose, so please bear with me, and then we have about 30 minutes, so we will open it to questions past that point. 

Pearse, I will ask you to go first since you have to leave.  When you were talking about this, you said we have to think about this at a global level, but Tatiana talked several times about how that's almost impossible in this universe.  I mean, as an AUSA, a federal prosecutor in the United States, I dealt a lot with MLATs and the transfer of data, and it was World War III every time we had to deal with these things.  How do we deal with this on a global level, instead of just talking about it?

For Maarten: you talked about transparency, but you have multibillion-dollar organizations that are going to come back at you with both fists, with guns blazing, over the intellectual property involved.  How can government get to the level of transparency that you think is necessary without dire consequences when it comes to intellectual property?

Milton, you referred several times to the idea that, hey, people have choices; you can decide if you want to do these things.  There is an argument that in our society today that choice isn't really there if you want to be an actual member of society.  Sure, we can forgo Facebook, but can you really forgo having a smartphone?

Tatiana, you made a reference to the idea that since we can't regulate this, it's too hard, a bridge too far.  That doesn't sound very aspirational.  It seems that we have basically given up, so if you can respond to that.

And Arthur, you said we cannot fix it later.  That was one of the first things you said.  But isn't the reverse true as well: that every time in my lifetime, my 40-some years, that we have regulated something early, we have almost always regretted it later on?  Section 702 is a great example of creating laws that have these huge consequences after the fact.  It isn't exactly regulation, but 702 came from Title III, which was a legitimate program.

What about the mission creep of government, that every time it gets its fingers into something, it ends up being a little dirtier than we expected it to be?  In this lightning round, try to answer the questions; if you want to rebut other people with your time, feel free, but if you can try to do about two minutes each, I would appreciate it.

>> PEARSE O'DONOHUE:  Okay.  You asked me first, and I do apologize that I won't be able to stay, because this is a really interesting discussion and I really look forward to hearing some questions.  The global perspective: in fact, your question to me is one I have to hit back with a straight bat.  We can only deal with this at a global level, starting with the standards, with the certification.  As bureaucrats, we are the last people who should be writing standards.  But if we created, or created a dynamic for, (?), for example purely European standards, then we would be cutting Europe off from the Internet, which is not really one of our policy objectives.  Instead, the objective is to work with others to drive up what we are talking about: standards and protection.  And even then, it doesn't give an answer to what happens when somebody brings a device which is not in conformity from another region into this area.  That is where we actually have to work on the technologies.  It's not for now, but somewhere, again at a global level, through the long-term introduction of IPv6 and through the engineering of different protocols, we can get to a situation in which every device, the first time it logs on, actually has its cyber hygiene, its security, checked by a system which is under the control of the network operators, who also have an obligation to play a part.  So, that's where the global comes into it.  And it does make it more complex, but it would be going in the wrong direction if we were trying to do it on a purely national or regional level.
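One way to read the network-admission idea described here is as a gatekeeping check the first time a device connects.  The sketch below is a minimal illustration under invented assumptions: the device profile fields, the policy rules, and the thresholds are all hypothetical, since no such mechanism is specified in the session:

```python
# Hypothetical network-admission ("cyber hygiene") check: the operator
# inspects a device profile the first time it logs on and refuses
# anything that fails a baseline policy.  Fields and rules are
# illustrative assumptions, not a real protocol.
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    firmware_signed: bool      # firmware carries a valid vendor signature
    default_password: bool     # factory credentials still in place
    patch_age_days: int        # days since the last security update

def admit(profile: DeviceProfile, max_patch_age: int = 180) -> bool:
    """Admit the device to the network only if it passes every baseline rule."""
    return (
        profile.firmware_signed
        and not profile.default_password
        and profile.patch_age_days <= max_patch_age
    )

healthy = DeviceProfile(firmware_signed=True, default_password=False, patch_age_days=30)
stale = DeviceProfile(firmware_signed=True, default_password=True, patch_age_days=400)
print(admit(healthy), admit(stale))  # prints: True False
```

The point of the sketch is the division of labor O'Donohue describes: the policy lives with the network operator, not the device vendor, so a non-conforming device imported from another region can still be quarantined at the edge.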

>> ARTHUR RIZER:  Thank you, sir. 

>> MAARTEN BOTTERMAN:  This gives me an opportunity to say to Pearse: I agree with you.


>> MAARTEN BOTTERMAN:  As with the botnets, we know that network providers can also check the (?) of their network for how safe it is.  But to honor your question on transparency, with multibillion-dollar companies, how do we make that work?  I guess that is one of the big questions of this time.  And I think there are two elements to it.  One is: how can I, as a small business, or as a citizen, an end user, get the knowledge to understand how it works, how my data are actually dealt with, whether it does what it says it does?  I need to have access to a resource that makes that possible and gives me a reliable answer.

Speaking very much for myself, although I have been exploring this with multiple people, I think there is something in it.  Maybe we need to create a kind of accountancy function we can go to for checking on algorithms, on how software is used, on how this works out.

The second element is that these multibillion-dollar companies do not want to release their source code easily.  Frankly, I don't know how to crack that, because where do you find, how can you set up, a capacity that is able to check the source code and is trusted by these companies well enough to do what it needs to do: checking the code and not sharing it with other parties?  How do you protect IP in such an environment, where you do want to be able to look into systems deeply to make sure they're not breaking the law themselves?

Answers welcome. 

>> ARTHUR RIZER:  Thank you, sir. 

>> TATIANA TROPINA:  Thank you very much.  So, the key phrase in your question was that it sounds like I want us to give up.  Well, I do want us to give up.  I want us to give up not on regulation or IoT safety; I want us to give up on generalizations, like "the industry," or "regulation," or "IoT regulation."  I want us to give up on unrealistic expectations of industry, again a generalization, and of consumers, and I want us to get real, because I do believe that regulation could save the day if we apply it correctly and if we can enforce it.  So, we have to think: who, where, how, how realistic, and how to enforce.  And this analysis might show that regulating a particular sector makes sense.  If we had been talking about self-driving cars, I would be the one standing up for regulation immediately, right at that point.  But when we are talking about "the industry" and "IoT devices," I am lost in this generalization.  So, we have to think: which industry?  What kind of devices?  What would be the balance between regulatory intervention, consumer choice, industry choice, and so on?  So, I want us to give up on the general debate.  I want us to get real and see where we can really predict the problem and fix it before it appears.  I want self-driving cars to be regulated before they appear on the road, you know, in front of my house.  That's what I want.

Thank you. 

>> MILTON MUELLER:  All right.  I was asked a strange question: do people really have a choice?  Maybe they have to be on Facebook, or they have to have a smartphone.  Well, as somebody who is not on Facebook, and as somebody whose mother doesn't have a smartphone, I think there is a very simple answer to that question.

I think the real thrust of the question was something different.  It was about the role of individual rational choice in the overall ecosystem, and my answer to that is yes, people do have choices, they will make choices, and your entire system, whatever system of regulation or non-regulation you come up with, will be driven by the choices that individuals make, based largely on more or less rational calculations of self-interest.  So, you can impose these regulations here, and that means that people over there, in some completely different area, will see opportunities, will see arbitrage opportunities, will see workarounds.  Their activities will be driven by choices based on optimizing their self-interest.  And this is what you have to take into account when you're talking about regulation.  Regulation is not God, a hand that comes down and corrects whatever goes wrong in society without any attachment to the special interests or miscalculations that are endogenous to that society.

So, that is what I think we have to bear in mind as we go into this debate. 

>> ARTHUR RIZER:  Excellent.  Thank you so much, Professor.


>> ARTHUR VAN DER WEES:  So, for debate purposes: wow, giving up on regulation, I'm really shocked.  And I actually agree that we need to stop having technically focused regulation, or industry-focused and sector-focused regulation.  We need to have a human-centric focus, and that's the GDPR, for instance, and a data-centric focus.

In health care, this year people have died because of simple ransomware attacks in Europe.  But last year it happened as well, and it was the same attack.  The year before as well.  This is health care.  I think it's very important.  Is it hyperconnectivity already?  IoT?  It could be mere connectivity, right, but we don't know.

So, how smart is it?  Human-centric and data-centric regulation here is there to protect the people who don't have a free choice, and even if they do, they are probably not able to make an informed decision, because apparently we assume that everybody can make an informed decision, which I don't think is possible.  Secondly, how can you be informed if you don't know what is going on, and you don't have, indeed, transparency about the product or the service or the system that you are using?  So, here again, I absolutely agree that we should forget about the discussion we have had for decades with certain industries that don't want to be regulated, because hyperconnectivity, IoT, is about hyperconnectivity of people, and that is the great thing about IoT.  That is why I love it, and I will be staying here, but also working anywhere, to try to help solve it.  I want to give one example of data-centric and human-centric regulation that you will love, actually a regulation article that you will like a lot.  That is Article 32, sorry, Article 32 of the GDPR: state of the art security.  State of the art security is actually in the law.  How good is that?  State of the art is what it is today, and state of the art is what it is next year.  A very well-chosen principle.  Then, although you are mandated by law to state of the art security, there are two tradeoffs.  The first is the cost of implementation: if it is just too expensive to implement the state of the art, you can go down a bit, but you need to make an informed decision, document it, and show why you did it.  The second tradeoff could be the purpose: it could be just a cuddly toy that doesn't do too much.  But then there is also the impact.  So: state of the art, minus cost of implementation, minus purpose, plus impact.
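The speaker's phrase "state of the art, minus cost of implementation, minus purpose, plus impact" can be read as a rough balancing heuristic.  The sketch below is purely illustrative: the numeric scale, the function name, and the weights are invented assumptions, not anything found in the GDPR:

```python
def required_security_level(state_of_the_art: float,
                            cost_of_implementation: float,
                            purpose_discount: float,
                            impact: float) -> float:
    """Toy scoring of the Article 32 balancing test described above.

    All inputs are on an invented 0-10 scale; the formula simply mirrors
    the phrase "state of the art, minus cost of implementation, minus
    purpose, plus impact".
    """
    level = state_of_the_art - cost_of_implementation - purpose_discount + impact
    return max(0.0, level)  # the obligation never drops below zero

# A cuddly toy: a low-sensitivity purpose lowers the bar...
print(required_security_level(8, 2, 3, 1))   # prints 4.0
# ...but a connected doll that records children raises it back up.
print(required_security_level(8, 2, 3, 6))   # prints 9.0
```

The second call anticipates the Cayla example that follows: the same toy category, but a much higher impact term, and so a much higher required security level.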

I will give you an example from last year: Cayla, very relevant, it happened.  That is the connected toy, and even though it's only a toy, the impact is very high, because you can pair with it from 30 meters.  There is no security.  It can talk to your kids, it can talk about all kinds of stuff, and it serves all kinds of advertisements.  The doll is still not off the market in Europe.  So, everybody knows it.  We have the full report; look at the Norwegian consumer association, very interesting stuff.  And it is still on the market, which is unbelievable.  Now, finally, last week the (?), that is, the French data protection authority, sent out a notice that they need to step up security and privacy.  Wow.  So, does this need to happen first, do people need to die first, before we get into this?  I don't think so, and this is worth debating.  So, come on.

>> ARTHUR RIZER:  Thank you, sir.  Just getting good. 

>> That is what the plan was, right? 

>> ARTHUR RIZER:  This is a debate.  We're going to open it up to questions.  We have 20 minutes. 

>> Audience:  I'm Martin (?), from a European NGO group.  I just wanted to underline one point: any time you want to regulate an industry, there is usually this argument that companies are going to go bankrupt.  This is very standard, you see it all the time, and it was an argument about the automotive industry when they tried to regulate it for safety.

This idea that there is ever going to be a totally free market I think is an illusion.  I think most people would agree with this, because, you know, if I want to sell cocaine, for instance, the State is going to prevent me from doing that, even though there clearly is a market and a demand for it.


>> So any time you have a regulation, you know, it just destroys a part of the free market (Audio pause)

>> Audience:  I'm from University College London, and I'm part of a large research hub that looks at standards, governance, and policy for the Internet of Things.  We've been working on this for a couple of years now, and I just wanted to make some observations.

The first one is that the title for this session is obviously asking the wrong question, because this is not about whether the IoT should be regulated.  The IoT already is, in large part, regulated by all kinds of regulations: by product regulations, by data protection regulation, by health care regulation, by transport regulation.  There are many, many regulatory regimes already in place.  The question is what is new about the IoT that may need additional regulation.  There is also the point that regulation is not the only tool in the government's toolbox; it is simply one of many.


The second point I wanted to make is that in the research we have done so far, we have seen no significant market drivers for IoT security.  In fact, quite the opposite: as Arthur said, it's a rush to market, and security is very low down on the list of priorities.  And I would also say that surely, after 25 years of increasing cyber insecurity, we're not still arguing that the market will fix this.  I mean, I think that conversation has been had.


The final point that I wanted to make is that a dimension we haven't touched on here yet is that this is not all about consumer devices.  A big part of the insecurity of the IoT is that we are rapidly connecting insecure devices that reduce the integrity of the networks themselves.  By massively connecting insecure items, each one of them an entry point, we undermine the security of the IoT as a system.

Thank you. 

>> ARTHUR RIZER:  Maarten?

>> MAARTEN BOTTERMAN:  On the last point: yes, you should be more conscious about the specific criticality of devices and of applications, because sometimes a device is something different in one application than in another.  A hundred percent on board (?).  That is why I think taxonomy is important.  Creating a taxonomy will greatly help progress on that, and so far I have not seen anybody willing to invest in starting that creation, which we all know cannot be done by one party alone.

>> ARTHUR RIZER:  Go ahead. 

>> TATIANA TROPINA:  Thank you very much.  I want to say something about that last point.  I was trying to make this point, but no one followed up: there are basically two ways to approach IoT regulation, from a consumer safety perspective and from a network safety perspective, and I believe these will be two different toolboxes, because using consumer safety tools to ensure network safety would be like using a hammer when you need a drill.  If we think about network safety, we have to approach it from a different perspective.  And this is also, I believe, where the cross-border component and enforcement become incredibly critical, and where the industry would have to be mobilized.  Regulation or incentives, I don't know, but otherwise it will simply not solve the problem.

Thank you. 

>> Panelist:  Just to be as provocative in reverse as Madeline was: we are having a conversation about where the market is not to be interfered with, and where it is.  You can't deny that that conversation is going on, and I don't think it is settled.  As we pointed out at the very beginning of the debate, the people invoking regulation are invoking God; they are invoking a mystical force that will come down and fix everything, and I didn't hear anything from you that was different from that.  It's just: the market has failed, let's have regulation.  And what does that mean, exactly?  Let's take as an example the Mirai botnet.  That was what really prompted this big push for regulation.  But some of the biggest breaches of cybersecurity have come from government, right?  Whether you're talking about the breaches at the Office of Personnel Management in the U.S., or about active offensive breaches coming from military agencies.  So how is it that suddenly the market is tarred as being inherently insecure and governments are these responsible actors?  I think we have to talk in a more nuanced way about the interaction between market forces and government.  And again, the idea of liability is something that puts market forces into play in bringing about security, as does cyber insurance, which is another mechanism that is evolving and starting to come into play.

>> ARTHUR RIZER:  Here is what we're going to do: I saw several people with their hands up, so I'm going to have you ask your questions all at once, and then we will go until time runs out.

First, I have a question from online.  I'm going to paraphrase, because I didn't exactly understand it: what would be the consequences of just shipping devices as they are and ensuring that individuals have to understand the security implications themselves?  In essence, devices just come dry, and people have to be responsible, you know, the way things used to work in the old days, when we had to download anti-malware services and things like that.

And then, sir, you had a question? 

>> Audience:  Yeah, just very quickly.  I don't think it's an issue of trying to solve the whole problem.  As Madeline pointed out, there are all sorts of regulations and standards that kick in right now with IoT devices.  Look at one of them: Wi-Fi devices are ubiquitous, and they have standards and limitations and regulations about how much they can transmit.  Yes, you can import one from somewhere else that violates the rules, but most of them do not, and in general we have Wi-Fi networks that are not killed by things transmitting too loudly or against standards.  So, you may not fix everything perfectly, but we can put things together that will make it better.

Thank you. 

>> ARTHUR RIZER:  Great.  Thank you. 


>> (Speaker not audible)

>> ARTHUR RIZER:  Thank you.  Ma'am, sir, and then, sir. 

>> Audience:  Tereza here, currently from the Oxford Internet Institute.  I have a more concrete question, I think.  There was software mentioned for checking the source code of companies, which implied a very good point about needing instruments and regulation that the companies actually believe in, so that they are willing to open their source code to that scrutiny, basically.  But I was just thinking: how, or against what, do we measure the wrongdoings in their source code, since we have no reference source code made to represent the ideal?  What are the wrongdoings, actually; how do we find them?  How do we explain them to the software that is checking source code it has never seen before?

>> ARTHUR RIZER:  That is a really good question.  Sir, real quick, and then Sir, real quick. 

>> Audience:  Thank you.  It's not the question, it's


>> ARTHUR RIZER:  Very briefly, sir. 

>> Audience:  That is why, as I expressed, I am really sad that we didn't have time to participate in this debate.  It is a debate between you, but not with us.  So, yes.  Okay.

So, I think that speaking about regulation of the Internet of Things doesn't mean anything.  Let's say it means so many things that people will not understand it in the same way.  So, I think it is confusing.  That is not the way we have to do it.

First of all, let's define the word regulation.  In French, there are two words that are both translated as "regulation" in English: réglementation and régulation.  And régulation carries more of a meaning of control than réglementation does.  I will not go into this field, but I would say that when you speak about regulating the Internet of Things, do you want to regulate the industry of the sensors, the industry of actuators, the applications?  What do you want to regulate?  This is not the way we have to address it.  I think the only thing we need to emphasize in regulation is data collection and data use.  This is the main issue we have to address, because it is very dangerous: with what we have now, we can collect everything about you, your relations, et cetera, and in the future we will know where you are going, that your car is going from this place to that place.  So, your personal data are collected and used.  This is what we have to regulate.

>> ARTHUR RIZER:  Thank you, sir.  Sir, do you have one last comment or question? 

>> Audience:  Thank you, yes, this is a question.  We heard that security has had no incentives to be pushed so far; in the toaster that may be right, in the fridge, et cetera.  But what I can see in the (?) association is that, with the ongoing adoption of the Internet of Things in industry, there is a lot of concern about security.  The more IoT gets into production and into services and into all the value chains of industry, the more security really is addressed.  And my question would be: should we not give industry the benefit of the doubt before we regulate and hinder industry in its development?  The benefit of the doubt that it will address the security question, let's say, adequately.

I don't say absolutely but adequately. 

>> ARTHUR RIZER:  Okay.  I'm going to turn it over to the panel to fix all the problems that were presented, in six minutes.


>> ARTHUR RIZER:  I want to let you know there is a paper; we have several copies up here if people want one.  You can also go to the website, but you can have this copy if you want it.  It is "Markets Versus Mandates: Solutions for Securing the Internet of Things."  It is a joint work of passion between Georgia Tech and the (?) Institute; you can go to either one of those websites and pick up a copy, or come to me afterwards and I'll make sure you get the website for it.

Thank you. 

>> TATIANA TROPINA:  Can I go first, because I have a short comment about the question about secure software code? 

This is always my problem with regulation, because the first thing about regulation is that you have to provide regulatory certainty.  And when we don't know what exactly is safe, what safe code actually is, that certainty is undermined.  The most important principle of regulation for me, as a lawyer, is how certain the regulation is and who defines the standards.  And I don't have answers to these questions, because I don't know how to solve this.

Thank you. 

>> Panelist:  I think the European Commission, with the certification schemes just mentioned, is actually doing what you mentioned, sir: giving the market a chance, not only the industry but the market, the stakeholders.  I think that is what they are doing: to have, within their regulation, say, codes of conduct and certification schemes that they can also work with.  That is what, in the coming few months, they're going to look at.  So, have a look at it.

On your robotics topic: well, 400 years ago we didn't have legal entities in the world, only individuals, and now we have very useful limited companies and foundations and institutes.  So, I think we will definitely, later on, see mandated robots, because a company or organization is also a mandated legal entity.  We will definitely see robots that get a mandate either from you yourself, where you are in control, including of the data and the safety, or from other organizations.  We will definitely see it.  I hope we will not see it too quickly in the militarized IoT, which I'm a little bit concerned about.

On your point here, here in front: what I would like to add is that you talk about data, and I agree, of course, but safety is something new in IoT.  Security now is safety.  People's lives are at stake here, and I think that is typically a task for governance institutions, preferably globally, to take care of, because if they don't, people will sue the member states and organizations, and I will definitely do so when necessary, because I think it is absolutely unfair otherwise.

Thank you. 

>> MAARTEN BOTTERMAN:  Yeah, I'll just agree with that.  I think, again, this is the right time to have these kinds of discussions.  It is not clear that all the answers are here.  I've heard a lot of things that also make sense to me, and we're moving in the direction of finding solutions.  With that said, without going into the individual questions, for the sake of time I will repeat my challenge to you, and all of you can take it back to your institutions and organizations: do try to find a way to really get to a global taxonomy on security, on privacy, on data use, and on safety.

>> ARTHUR RIZER:  Okay.  Because we started late due to the last panel, I'm going to take one last question, because you have had your hand raised for about 15 minutes.  So, do you want to ask one last question?  Yes.

>> Audience:  Yeah.  Actually, following the gentleman's recommendation: instead of the word regulation, can we just talk about IoT governance?  I think that will bring more stakeholders into the discussion.

This is No. 1.

No. 2, I am from Bangladesh, which is a developing country.  In Bangladesh, the Internet is a very hot topic of discussion, and technology also, but IoT is not yet part of the discussion in practice.  So, for whoever is ultimately trying to suggest IoT governance or regulation: how do developing countries come into this scenario?  Is there any kind of consideration and planning for that?

>> Panelist:  I think there is.  I think IoT is not the right term; we talked about that a bit already.  Perhaps "smart society" is also not perfect, but I think it is a bit better, because it speaks more to any society, in any part of the world, where we see very nice initiatives all around the globe.  And we also see very interesting sustainable development goals, even on data, on digital data that we are sharing around the globe; that is, by the way, an initiative of the United Nations.  So, I think IoT is a bit too technical: I think of sensors.  Think instead of how to apply it, how, for instance, to make a smart home sustainable, so that it generates its own energy, with a smart grid shared with the neighbors.  Those things can be done in a frugal way instead of an extensive way.

Your point on the title, the title was --

>> ARTHUR RIZER:  We had to do something to get you to come; the goal was to get you to come here and disagree with our title.

>> Panelist:  We play the game, but we are not all simply in favor or against, and that is why we do this.  It is not only about regulation, or even governance; it is more organizational, it is even more fluid than that.  I agree with you.

>> ARTHUR RIZER:  Thank you all for coming.  We really appreciate it.  I'm going to be outside, and I'm sure some of the panelists will be as well.

Thank you very much.


(Panel concluded)