IGF 2022 Day 2 Open Forum #107 Technology and Human Rights Due Diligence at the UN

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> PEGGY HICKS: Welcome, everybody. We're going to start the session. Grateful to be with all of you. Thank you for joining us online and in‑person. I'm Peggy Hicks, of the UN Human Rights Office in Geneva. I see we have with us Peter Micek from Access Now. Grateful for your support, as well as that of the European External Action Service, and of all who contributed to this session.

A brief overview of what we're trying to accomplish today. We're talking about a document that's been in the works for a while, a very necessary document: one that looks at how the UN itself brings human rights due diligence into our approach to digital technologies. This is, of course, crucial, because as an international organization active across the globe in many different spheres, it's an important thing for the UN to get right. Hopefully, we'll be able to set down some guidance that will be useful to others in this field as well.

So this is an interagency process that's been underway for some time, and we're now on, I think, the third draft of the guidance, which we're going to be talking about today.

But we're nearing the close, and we really saw this opportunity at the IGF as a place to make sure that we're getting the right types of input, and to build real public engagement around this document, which is so important.

So what we'll do today is provide a summary of some of the comments that we've received from stakeholders on this latest version 3 of the human rights due diligence document for the UN. We're going to share the views from our end, the UN Human Rights Office, on those comments. And we want to have an open discussion both on the document and the comments that are there, and we'll also update everybody on where it's heading.

We have with us today online Catie Shavin, a consultant who has been working with our office on this and the lead drafter, along with Scott, who leads our digital rights and tech work at OHCHR. And Nicholas Oakeshott, senior policy ‑‑ and we're hoping to have Victor Kapiyo, from KICTANet, a multi‑stakeholder think tank. And finally, the UN's tech envoy will join us later and make some comments near the end of the session.

So, very happy to have you all with us. And with that, I'm going to hand over to Scott.

>> SCOTT CAMPBELL: Great. Thanks so much, Peggy. And hope you're hearing me OK in the room there. So I'll take that as a green light.

But thanks to everybody for attending. It's clear that with the number of parallel sessions going on at the IGF, it's really challenging to prioritize.

Very happy that we're having this next public, multi‑stakeholder consultation on the draft guidance.

I really want to thank all of those that have been involved in giving us very constructive and thoughtful feedback to date throughout the various consultations that we've had.

Peggy, as you noted, it has been an extended process, and in fact, I did want to say just a couple of words about the process. The first thing is just that it's been very useful on a number of fronts, and I think in particular in building mutual understanding among a really diverse group of actors within the UN and outside the UN on what human rights due diligence actually means, applied to how the UN is using digital tech, and what that means in practice as well.

We as the UN Human Rights Office have learned enormously in going through this process and hope it has been a mutual learning process for those involved. Linked with that is our determination to make this guidance really practical, useful and used.

We're going through this extended process to make sure that people understand it, that there's buy‑in, and that it can be implemented within UN mandates, which are of course extremely varied across the UN system.

We have taken into account the written comments received in September and October on our third draft. Thanks again to all who have contributed. We'll share thoughts. I'll hand over to Catie in just a minute for her to give a summary of the feedback that we have heard, and how we're considering incorporating that in the next version.

We will subsequently share with you a fourth version for your consideration as well. We're very happy, as we have been along the way, to enter into further consultations as need be.

While we do hope to bring this process to a close in the first part of next year, in terms of the specific timing and endorsement bodies, we will ultimately share a revised version of the draft guidance with the executive committee of the Secretary General, which is the senior‑most internal‑facing committee under the Secretary General. We'll submit it to that body for endorsement, and then the Secretary General may ultimately decide to share it with the Chief Executives Board of the United Nations for their endorsement.

The exact timing of when we share it with the executive committee of the Secretary General is tied to a parallel internal process that is ongoing and also relates to human rights due diligence. In short, this is a process whereby the Secretariat is looking at the existing human rights due diligence policy, which applies more narrowly to UN support to non‑UN security forces and has been in its implementation phase for over a decade now. That policy is currently under review to see how it might be expanded more broadly across the UN.

So our process, of course, which looks at more specific guidance on human rights due diligence in the UN's use of technology, will essentially build on that broader policy that will hopefully be forthcoming after a meeting of the executive committee in the first quarter of next year.

But just to assure people, and this is a question that has come up, the development of this guidance and the development of that broader policy are very much aligned both in substance and in timing.

Just maybe a last note: we envision an extensive roll‑out, with our office engaging entities in the field on how exactly the guidance can be implemented, doing webinars, and developing tools and resources to facilitate the implementation.

So, Peggy, with that, I'll hand it back to you and look forward to the discussion.

>> PEGGY HICKS: Thanks very much, Scott. That's a great overview of where we've been and where we're headed. Now we're going to turn directly to Catie, there with you, for an overview of the feedback received, how we're incorporating those comments so far, and a general sense of the document itself.

>> CATIE SHAVIN: Thank you very much, Peggy, it's great to be here today and great to have an opportunity to present some observations on the feedback we have received on the third draft of the due diligence guidance. As Scott mentioned, I'm supporting this as a consultant to the UN Human Rights Office.

To give you some context on my perspective in analyzing the feedback from those who have reviewed the guidance: I'm a business and human rights specialist, and a lawyer who provided pro bono support to the team that developed the UN Guiding Principles on Business and Human Rights.

And for the past ten years I have worked with business practitioners, supporting peer learning on how to implement human rights due diligence in complex organizations.

Before I dive into the feedback we received on the third draft, I want to provide just a bit of information about what the guidance looks like, for those who aren't familiar with it. Very briefly, it's a 17‑page document that includes an introduction providing high‑level information about the guidance, including why it has been developed, who it's for, and how to use it.

It then introduces human rights due diligence for digital technology use, what that is, and why UN entities should do it.

The main substance of the guidance is then set out in a more detailed third section that addresses practical approaches to implementing human rights due diligence for digital technology use, including: embedding human rights due diligence across an organization; steps to identify and assess impacts; the action that should be taken where potential or actual human rights impacts are identified; and processes to track progress and effectiveness, as well as to communicate about the organization's approach.

The guidance then concludes with a frequently asked questions section that addresses some common questions, picking up on some of the questions we're getting through this review process, and provides links to further resources that may be helpful to UN entities as they embark on implementing it.

Turning to the feedback that we received, we received comments from 15 stakeholders on this third draft, mainly from within the UN system, but also from other stakeholder groups.

I think it's probably fair to say that we received less feedback on this draft than the previous one, and that we saw generally positive reactions to the direction the guidance is currently heading in. I think it's important to emphasize the feedback was incredibly constructive, thoughtful and welcome, and has been hugely helpful as we start work on the fourth draft.

There are several substantive areas of feedback I would like to share with you today. As you might imagine, we received a wide range of comments, but I wanted to highlight and focus in on the key things.

Firstly, and to begin on a positive note, a number of stakeholders welcomed the direction the draft is now taking, and really emphasized that it's helpful that the draft itself emphasizes the iterative nature of human rights due diligence and the need for entities to tailor it to their context, designing a process that works for them.

They also welcomed the plainness and accessibility of the language, which reduces barriers to engaging with the content of the guidance.

We hope this takes a lot of the anxiety out of discussions internally within different entities and really helps focus minds on what operationalizing it might look like instead.

When developing the third draft, we really thought about how to frame the guidance, based on some of the feedback we heard on the second draft. It was positive to hear for the most part that approach resonated with those who reviewed it.

Secondly, structure. A number of reviewers commented that some of the introductory material felt quite repetitive and that some key concepts included in the body of the guidance would have been much more helpful earlier on.

So, for example, we received a lot of comments on the earlier sections of the guidance that highlighted questions or confusion about the expected scope of entities' human rights due diligence for digital technologies.

That feedback has been very helpful, and we're now looking at changes we can make to the structure to ensure that key information about the human rights implications of digital technology use and the scope of human rights due diligence comes much earlier on.

And also that we communicate very, very clearly that the guidance encompasses the full cycle of digital technology use, starting with conception and design.

We're also working on some options to streamline some of the introductory content and help people get into the meat of the guidance faster. Thirdly, we received a lot of feedback on the second draft about the need to better align the guidance on prioritization with the approach taken in the UN's existing human rights due diligence policy, which focuses on grave violations. That feedback was directed at ensuring consistency between this guidance and existing policies.

In the third draft we sought to build on that feedback and include guidance on prioritization that focuses on severity and aligns with international standards like the UN Guiding Principles, but also incorporates a minimum threshold focused on grave violations to align with those existing policies.

We received helpful feedback from one entity in particular that, on reflection, it might be more straightforward to simply focus in this guidance on a risk‑based approach to prioritization centered on severity, and then leave entities to ensure that they implement it in a way that aligns with the other relevant policies they need to comply with, including the existing human rights due diligence policy. I think this is likely to be reflected in the next draft.

Fourth, use of leverage. We also received helpful feedback that the guidance would benefit from practical examples of what using leverage to encourage third parties to act would really look like in practice. This is something that we shied away from in the third draft, mainly because I was conscious that it was already becoming significantly longer than the second draft, and I wanted to avoid it getting any longer.

But reflecting on the feedback we've received, we agree it could be really helpful to include some of those examples, and certainly from my experience working with business and other organizations, grappling with human rights due diligence, I know that the use of leverage can be one of the more challenging areas of human rights due diligence for organizations to get their heads around, particularly when going beyond traditional commercial leverage.

So we're looking at ways to do this that are useful and practical, and that don't overcomplicate the guidance.

We asked stakeholders for input on the value of including illustrative examples throughout the guidance, and for suggestions or requests in terms of the types of scenarios, types of digital technology use, or digital technology human rights risks that would be helpful to address.

All stakeholders agreed examples would be helpful and some offered concrete suggestions which will help us to flesh out some examples that really resonate with the types of issues and challenges that we understand entities are grappling with right now.

And then finally, carve‑outs. In relation to the second draft, we received a number of requests for carve‑outs, so effectively, exemptions from having to do prior human rights due diligence.

These stakeholders were concerned about the implications of a potentially onerous due diligence process for very important work. In the third draft, we took care to really emphasize the flexibility that entities have to tailor how they do human rights due diligence, to ensure it works with their activities and the context in which they operate.

That, I think, resonated with at least some of the stakeholders who expressed such concerns, and we received feedback that the approach the guidance now takes has been helpful in alleviating some worry about what the practical implications might be for their work. We're continuing to explore these concerns with the stakeholders.

And we're also looking at options to develop a worked example that illustrates how human rights due diligence might be implemented in relation to urgent or lifesaving work in a way that navigates those challenges in a rights‑respecting way.

Those were the six key areas of feedback I wanted to share with you today. And as I went along I flagged some of our current thoughts on how those areas of feedback might be addressed in the next draft.

As I mentioned, and I really want to emphasize this, the feedback has been invaluable, and we're so grateful to those who took the time to share thoughts with us. Some of whom really invested effort in extensive internal consultation and socialization processes within their entities to be able to give us the most helpful steer for the next draft.

When we developed the third draft, we took stock of some of the helpful but challenging feedback and really rethought our approach, and the third draft looks very different from the second draft. This time the feedback was generally supportive of the approach we've taken, really looking at how we can refine it and make sure it's as helpful to entities as possible.

Accordingly, I think that the fourth draft will build on the third, but is unlikely to be radically or significantly different. That's probably quite a lot for me to get this conversation started. So I'll hand back to Peggy. But I really welcome any questions or comments when we open up the discussion. As well as additional feedback or suggestions which we will take into consideration as we prepare the next draft.

>> PEGGY HICKS: Great. Thank you very much, Catie. I guess it's a good moment for me to also very much thank Catie, whose service and work on this has been invaluable. As I'm sure you can hear from her comments, she has approached it with a wealth of expertise and experience.

And even more importantly, with an attitude of problem solving and of really listening to and incorporating the valuable feedback we've received. Thank you so much, Catie and Scott, for your valuable work on this.

We wanted to offer a couple of general comments going forward and then open it up for questions and comments from all of you. I'm very grateful that I've been joined by Victor now, and I'm really looking forward to his remarks.

As I said earlier, he works at KICTANet, a multi‑stakeholder think tank. We welcome your thoughts on this initiative, Victor.

>> VICTOR KAPIYO: Yes. Thank you very much for the opportunity. I want to thank the team that has worked on the document and the drafts so far to incorporate the views of stakeholders. I believe it's very important that institutions such as the UN are starting to take action.

We have been focusing a lot on companies, but it's important to start looking inside to see what more could be done by the single largest human rights institution. It's important to start leading by example.

Just a quick response with some comments on the draft. So far, it has made good progress, from what has been said. There are a few issues from my side. The first is implementation in terms of engagement with suppliers.

Is it possible to have stronger requirements for suppliers in terms of demonstrating their human rights compliance, in terms of whether they're submitting reports and things like that as part of the process?

How can we put in place measures to assess the unintended consequences and effects of technology? Sometimes in the assessment we might have been thinking one way, but things or circumstances change, creating impacts that were not intended.

And also, speaking from the continent, we are looking at how technology impacts groups who are already at risk. So perhaps the guidance could elaborate a bit more on special groups, whether you're looking at children or other specific categories of groups that are adversely affected. Perhaps it would be useful to have some more consideration there.

And then we have the aspect of reporting: how can we elaborate more on the communication aspect? As Civil Society, we would like to engage and see how the UN is implementing the measures. What measures could be put in place to require the various agencies to collect data and statistics and provide information on the types of technology that they're adopting?

And on the suppliers that they're engaging with, so that we can track progress. We can't monitor progress at the UN if we don't know what they're using and how they're using it. So I think an elaboration on that would be useful.

I know it was mentioned that there's going to be a series of webinars and things like that. Perhaps the guidance itself could also help the various agencies to start taking measures to prepare, as a first step, to educate their teams on why this is important and why it's important to do it, over and above the document.

And lastly, we know that there are various types of technologies already in use. Perhaps an elaboration on which technologies these are, so that institutions start thinking more clearly about how we're using artificial intelligence, biometrics, and so on and so forth. Perhaps more elaboration on the dangers that these technologies pose. Thank you.

>> PEGGY HICKS: Great, thank you so much, Victor. A wealth of really helpful insights and comments there. I love the way that you started off by noting that one of the things we need to look at within these discussions is the multiple actors in the space and how they all have human rights responsibilities of different kinds.

So we do focus a lot, as you said, on the companies, but the UN itself has roles in it. And then, of course, there's the state role, both within the UN Guiding Principles and in fulfilling human rights obligations in this space. So we ought to be asking the same questions of states about their human rights due diligence processes for their own use of digital technologies. We hope the conversation and broad‑ranging engagement around it will help us set some good practices in place. I appreciated all your comments, but wanted to pick up on the note about the tension in trying to make sure that what happens here is seen as a process in which people want to invest time, because people are very busy, and they need to do the jobs that are so crucial.

All UN partners have that in mind, and we want to make sure we're communicating effectively about how this can be done efficiently, but also about why it ultimately delivers value: why it's not an additional hurdle to get over, but something that can help with the delivery of effective programming in a variety of ways.

So we have one other commentator we wanted to bring in, and then I'll open it to comments from the floor as well. I know we have a couple lined up already. And then obviously we ultimately want to go back to Catie and Scott for some reflections too.

But I'd like to turn the floor next to Nicholas Oakeshott, from UNHCR. Really looking forward to your thoughts.

>> NICHOLAS OAKESHOTT: Many thanks. I would like to start out by saying how much we appreciate the thought and care that has gone into developing the guidance thus far, and we welcome the careful consideration that's been given to our comments. We think the third draft is well on the way to providing what Scott, Catie, and Peggy identified at the top: practical and implementable guidance within the UN.

Let me start by saying why this guidance is particularly important for UNHCR. It aligns with key elements of our new digital transformation strategy, which runs from this year until 2026. This strategy is different from many other digital transformation strategies because it's focused primarily on transforming the digital lives of the people we serve in a rights‑protective and rights‑enhancing way, as well as on how UNHCR's own organizational transformation can support that overall objective. The strategy contains important commitments on ethics, on what we're terming digital protection, and on strengthening digital mechanisms for our own accountability to affected populations.

For example, the section on digital protection includes an important commitment, namely that UNHCR's own use of digital technology will continue to increase protection and align with international human rights and ethical standards, and that these standards will also be promoted with states and the private sector, with a focus on high‑risk technologies, uses, and contexts.

So we see that the finalization and future implementation of the guidance, alongside other developing UN internal standards such as the ethical principles on the use of artificial intelligence in the UN system, which were adopted by the Chief Executives Board in September of this year, can provide an important framework to help realize those strategic commitments contained within our strategy.

And this lines up well, we think, with broader UN organizational commitments within the Roadmap for Digital Cooperation and the Secretary General's Call to Action for Human Rights.

Turning now, if I may, to some of the challenges that we see in respect of the guidance, its development, and its implementation, and how to address them. UNHCR is a humanitarian organization with the protection of refugees and (?) at the core of its mandate. It works in the challenging context of conflict and constrained resources, and of the varying capacity and willingness of host states to protect the forcibly displaced. If UNHCR were a business, these sorts of situations would be considered as meriting enhanced human rights due diligence.

Indeed, protection risk assessments are integral to UNHCR's work, and it has extensive risk management policies and procedures in a wide range of areas related to digital tech: for example, data protection, procurement, and digital technology partnerships. However, human rights due diligence itself is comparatively new to the organization.

As a result, in our responses to the drafts, we've carefully analyzed how the guidance can strengthen UNHCR's existing approaches ‑‑ to quote from the Human Rights Council ‑‑ in the conception, use, design, and further deployment of digital technology.

At the same time, we're concerned to ensure the guidance doesn't inadvertently impede the delivery of lifesaving humanitarian assistance. This is the point that Catie flagged up.

What does this mean in practice? There are concerns, for example, about whether a human rights impact assessment will be required for the use of digital technology prior to every emergency response, with the risk that it would slow down the delivery of lifesaving humanitarian assistance.

UNHCR may face stark choices between using a particular technology or working with a particular partner or supplier on the one hand, and, on the other, not delivering that lifesaving humanitarian assistance.

So we hope the final stages in the development and adoption of the guidance can address some of these remaining concerns in more detail.

The third draft states, quoting here, "if the UN entity is new to human rights due diligence, steps to get started should include (?) research and conversations with internal or external stakeholders or an internal workshop to learn about key human rights risks and issues relating to the entity's digital technology use." With this in mind, UNHCR has secured additional funding to undertake an internal simulation of the guidance and has formed a multifunctional stakeholder team internally to prepare it and to support our engagement with the guidance's future development.

To learn from the private sector, we're grateful to have secured expert support for the simulation from human rights due diligence experts on the responsible business team at DLA Piper, a multinational law firm. This is part of a much wider partnership between UNHCR and DLA Piper over the next three years, launched on World Refugee Day this year, which includes an extensive pro bono commitment and funding for a refugee environmental protection fund.

In addition, we're looking to other UN experts on human rights due diligence, to see if they can support the simulation of the implementation of the guidance.

We hope, by looking at case studies in the challenging contexts in which UNHCR often works, to build our own capacity and provide recommendations on some of the difficult questions that I flagged up already, and also to think about how we can best bring the communities we serve into the future implementation of the guidance. That's another key principle contained within our digital transformation strategy.

One final thought in closing: I very much echo the thoughts of Catie and Scott. The process of developing the guidance has been very helpful to UNHCR, because it has allowed us as an entity to think through how human rights due diligence in relation to digital technology can help us meet our strategic goals and strengthen our own risk management processes.

As I think we were aware at the start of this process, we very much welcome the opportunity that the guidance development has given us to build momentum in this area. So thank you very much, and we look forward to the further conversations and discussion.

I'll go back to the room.

>> PEGGY HICKS: Great, thank you so much, Nicholas. It's really encouraging to hear from UNHCR on this, both in terms of the foundation you're starting from ‑‑ the digital transformation strategy that you have in place and the provisions within it that already reflect some of these needs ‑‑ and the work already ongoing with protection risk assessments and other tools.

But then also how this process can deepen that work and engage across the broader range of human rights issues, using human rights due diligence as a useful tool.

And in particular, the effort that you've gone through to put in place the simulation is really very exciting, and we're looking forward to hearing how that works and what comes out of it going forward.

It also underlines the extent to which this is a process and not an endpoint: whatever we do in human rights due diligence is an iterative approach that we will always need to refine, based on what we learn from how it's working and from the involvement and engagement of different groups. So we're really looking to all entities to play a role, while recognizing that that role will vary in different ways over time.

With that, I would love to turn to all of you listening as part of this conversation and gather some more comments and thoughts on the conversation that we've had so far.

And I have a note that we have two people in the room, I hope, who are happy to come in at the outset. I'd like to turn first to Susan from Common Cause (phonetic) Zambia.

And the microphone is coming to the back for you Susan.

>> SUSAN: Thank you very much, Peggy. Thank you very much for the opportunity. And also, thank you to the previous speakers for their presentations, which were quite comprehensive.

I just have two comments that I would like to make ‑‑ rather, questions, actually. The first one is that we notice that most times, tech is deployed in conflict areas, and also in areas that require desperate assistance.

So to what extent were marginalized groups in the global south, particularly women, consulted in developing the guidance? Because most times, these are the people affected by the technology that is usually deployed.

And then my second question is: to what extent are some of the harmful human rights practices that we note reflected in the guidance? For example, we are seeing more and more countries introducing things like digital IDs and collecting biometric data from citizens. So to what extent is that reflected within the guidance? Thank you very much.

>> PEGGY HICKS: Great. Thanks, Susan. I think that second point goes back to the point Victor made as well, about maybe unpacking some of the specific areas where we know there are ongoing issues, perhaps as case studies or annexes in some way too.

I was also hoping we might be able to get Peter Micek from Access Now to come in with some thoughts?

>> PETER MICEK: Thank you. We're really excited about this guidance, and thanks for inviting us to participate throughout this process. I think Civil Society is really a key stakeholder, and will often be the one suffering the brunt, whether of reprisals or of misuse of technologies.

This guidance is really essential to the ongoing legitimacy of the UN itself and of the UN's work in the digital age. I think the best time to plant the seed was probably at least ten years ago. The second‑best time is today.

But honestly, we needed this 20 years ago. Some of these policies, especially around the use of biometrics, have been in place since the early 2000s, and that's coming from the top, from the Security Council. And the harms have been compounding since then, where vulnerable and marginalized communities especially are forced, without any meaningful legal basis ‑‑ whether consent or otherwise ‑‑ to submit really sensitive information, often to third parties and vendors.

And that gets misused and reprocessed and used against them for literally decades.

So this guidance is too late, but we're trying to make sure that it's not too little. This will need an ecosystem to thrive, and to use a tree analogy, these roots need to be interlocking. This can't be standing alone and suffering the winds.

And linking to other processes, many of which have been mentioned: the B‑Tech studies around business models. Is that going to be played out and expressed in the guidance or in some of the examples? Looking at data protection compliance regimes: the ICRC developed a really extensive regime for itself. But also procurement and screening processes, as Victor said, both in the EU and Member States: what are best practices in screening?

And there is mandatory human rights due diligence coming; the EU is about to pass some rules. How is this going to be reflected, and is this really going to raise the standard, or is it a floor of what we expect from agencies?

To wrap up the tree analogy, it will need nourishment. This is where Civil Society and nongovernmental organizations come in, informing the guidance. This multi‑stakeholder team sounds internal. But with the use of iris scanning technology on refugees seeking assistance in Jordan by UNHCR and some of their private vendors, they're not keeping up with some of the developing norms on the application of human rights to new technologies. We would say that listening to Civil Society is essential to ensure that ‑‑ and we agree ‑‑ these norms are developing. The conversation wasn't the same in 2015 as it is now.

We want to contribute to that, and I will say that Access Now, just yesterday at IGF, launched a new declaration on content governance. Principles for social media platforms, engaging in times of crisis.

We believe that will apply to a lot of these humanitarian situations that are being discussed. But it is, as you said, a much broader thing that will involve all UN agencies. And we're really standing by to support the urgent need for implementation here. Thank you.

>> PEGGY HICKS: Great, thanks, Peter. Thank you for a really compelling statement about why this process is overdue and much needed. And really looking forward to finding those avenues to make sure that the Civil Society engagement that Susan and you both mentioned is the foundation for the work going forward. But noting, as you said, that will require investment and resources, and to be frank, I'm not 100% sure where those will come from at this stage. I do think that's something we need to all think about going forward.

I expect we will probably have some comments online and in the room. But just so we don't pile up too many comments before we get a chance to hear back, I think it makes sense to quickly see if Catie would like to come back in. Catie or Scott. And maybe Nicholas might want to say a word on the UNHCR point that was raised, before we turn to other questions. Catie, do you want to make some initial reflections on the comments already received?

>> CATIE SHAVIN: Thank you very much, Peggy. Thank you to everyone who's made comments and asked questions. This has already been very helpful as we start to think about the next draft.

To address just a few of them. A number of the questions and comments got to the need to get really clear on, for example, how UN entities can implement this guidance in relation to their engagement with suppliers, which we know from the experience of business and others is complicated and takes time and a lot of learning by doing.

As well as to get into a lot of the data around the risks, the types of human rights risks associated with digital technology use and the ways that can affect groups already at risk of vulnerability and marginalization.

These are things we have thought about a lot and will continue to think about as we shape the next guidance. One of the things we're conscious of ‑‑ and I think the last speaker put it very nicely when he talked about an ecosystem for this guidance to thrive in ‑‑

That is, in order for it to be navigable and manageable, it can't do everything for everyone, because it would become incredibly long. So we're looking at what's already in place in the UN or external that we can leverage. There are some wonderful resources available that get into human rights impacts of various types of digital technology use. Rather than bringing all that data into this guidance, we're looking at ways we can connect users of the guidance to that material that's available elsewhere, while providing some illustrative examples that help users to visualize what it might look like.

Similarly, we are thinking about the extent to which we can give very concrete guidance and support to entities that are looking at those more challenging areas, such as engagement with suppliers and the broader digital technology value chain, and how we can also flag that need to learn by doing, to work in collaboration with other organizations, and to recognize that for a lot of these challenges, there isn't a one‑size‑fits‑all approach.

We can provide high‑level guidance that will be applicable across different UN entities, but a certain amount of the work and thinking will also need to be ‑‑ so we are trying to find ways to help people navigate that without overwhelming them, leveraging in smart ways what's already out there, but then with the material that UNHCR hopes to be able to provide to supplement this guidance and give further support.

I'm just looking at my list of questions to see if there's anything else helpful for me to elaborate on now. I think the other piece would be reporting and communication. Apologies for the background noise. I have a small person with me who's increasingly impatient.

We received a number of comments about the reporting and communication piece and the need to be more explicit, and also to help entities think through what is appropriate in terms of their role, as distinct from businesses or other types of organizations, in providing that transparency. One of the things I think we have to be very clear about in the next version of the guidance is that formal reporting is not necessarily required.

There are many ways to engage with stakeholders, and we want to encourage entities to explore what might be the most appropriate communication in what type of circumstances.

There's a need to think about the feedback we've gotten about the importance of transparency for entities within the UN system and what the ‑‑

>> PEGGY HICKS: Appreciate those. Nicholas, I promised you a 30‑second intervention, because Peter had name‑checked you all on one issue. Would you like to come in on that?

>> NICHOLAS OAKESHOTT: Sure. Thanks for the opportunity, Peggy. Peter and Susan, just two points in response. I think, yes, the role of Civil Society in enabling UNHCR to effectively implement the guidance when it's adopted is well noted. So I can look to try to put that onto our internal implementation agenda.

I think as we've been expressing, this is a process. And we need to ensure that there are appropriate levels of consensus and comfort within the organization.

And an understanding of how we're progressing is something we can look to think about and talk about separately.

Personal data, particularly sensitive personal data: I think, yes, this is something that is covered by UNHCR's existing policies in respect of the personal data of the people we serve. The human rights due diligence and this guidance are relevant to that. We are taking extensive measures to consult more with the people we serve about these sorts of issues in the digital transformation strategy. We did an extensive multi‑country, really quite varied stakeholder consultation about what digital services the people we serve want us to provide them in the future.

In short, they want us to provide more. And so we want to see how we can use that sort of end‑user, user‑experience research approach to really bring the people we serve into the co‑design of the systems that we'll develop moving forward. So perhaps that's an area we can also discuss in the future.

Thanks a lot, Peggy, and back to you.

>> PEGGY HICKS: Great. Thanks very much, Nicholas. We're rapidly running out of time, and we're very fortunate to have been joined by the UN tech envoy. We have several people in the room and online. Five minutes for additional comments. If everyone can keep it to one minute, we'll try to get at least five of you in.

I saw a hand in the back there.

>> CATHERINE: Hi, thank you so much for a very interesting presentation about the work you're doing around the due diligence guidance on tech. I'm Catherine, from the Institute for Human Rights, and I work as the responsible value chains program manager and work within the area of human rights and business and technology.

I have a few comments; I'll try to keep it as brief as possible. First of all, to Catie ‑‑ and hello, Catie, we know each other from the past ‑‑ it's great to hear that you're looking at already existing tools and resources for human rights impact assessment, risk identification, et cetera, within the area of tech, and how they could be applied by UN entities.

We have done a lot of work in that area, and I'm happy to also discuss afterwards how we might be able to apply some of the work that we've been doing, also together with partners. And then just a comment on Peter's intervention: he raises a lot of very pertinent points, including the changing regulatory landscape. There are the developments around the human rights due diligence directive, but we are also seeing a lot of regulatory developments around tech specifically, and a lot of discussion around how these tech policy developments at the EU level ‑‑ also in other countries, but especially at the EU level ‑‑ are linking to broader efforts to support human rights due diligence by private entities, but also the state in meeting its duty to protect human rights in the context of business‑related activities and impacts.

I just wanted to mention also, when it comes to understanding the ecosystem and the different roles and responsibilities of various actors within the digital ecosystem, that the Danish Institute for Human Rights, together with BSR and B‑Tech, has been working on developing an outline of the digital ecosystem under the Tech for Democracy coalition action on responsible technology.

That could hopefully provide inspiration and an overview of different actors, roles and responsibilities that could also help inform this guidance.

>> PEGGY HICKS: That's four points now. [Laughing].

>> CATHERINE: Just finally, a small point around stakeholder engagement and one of the key stakeholders. I'm also joined here today by my sister (?) from different countries. We have a digital alliance of national human rights institutions, so national human rights institutions are also centrally placed to create that bridge.

>> PEGGY HICKS: Sorry ‑‑ I see hands in the room and online. So thank you, and we're very glad to have the National Human Rights Commission people with us. I want to go to Anna.

>> ANNA: We have a question about data collection and the risks associated with it, and a question about whether the document is publicly available.

>> PEGGY HICKS: I see John there. You want to come up?

>> JOHN: I just ‑‑ sorry. I have a question also, relating to a point raised by CIDA (phonetic). In terms of the guidance you're developing, how transparent will the decision‑making be? Will you be providing information publicly on, for example, how you arrive at decisions around software procurement?

I'm thinking, for example, particularly in relation to personal data and privacy.

>> PEGGY HICKS: Thanks very much. I'm sorry to be so rushed. I'm sure there are other comments. But I want to give the tech envoy a chance to make some remarks. So over to you, H.E. Amandeep Singh Gill.

>> H.E. AMANDEEP SINGH GILL: Thank you, Peggy, and thank you for giving me some moments to catch my breath. I'm sorry I missed the initial parts of the discussion. But I want to underline our support for this very important effort. This was a theme mentioned in the High‑Level Panel in 2019 and then in the Roadmap, and I'm happy to see the progress in terms of the draft due diligence guidance.

But I'm also happy to see that this is fitting in nicely with one of the key themes for the Global Digital Compact ‑‑ the protection of human rights online.

And also, this relates to the aspect of sharpening accountability for all those who may have responsibilities related to human rights, whether they are aware of them or not.

With these instruments, there are process issues in developing due diligence. Transparency was mentioned, as was the internal UN usage of these instruments.

How are we able to communicate that better? How are we able to keep them updated? Because if it's just a one‑shot thing, then during the deployment of certain technologies, particularly data‑driven AI systems, things change. What works in the sandbox may not work the same way when you take it to scale.

So we need to keep an eye on the lifecycle consequences, and this due diligence cannot therefore be just a static tool. It has to be a dynamic tool.

It has to also keep in mind the evolving regulatory and governance landscape, because, at the end of the day, UN entities operate in jurisdictions, legal jurisdictions where there are certain considerations related to governments, and data was mentioned by our colleague.

So I think that would clearly be important as well. One last thing I want to mention is that it's important to have these discussions in the digital context, but we should never give in to the idea ‑‑ where we start to kind of go down this path of exceptionalism ‑‑ that this is a special place, so special considerations apply. At the end of the day, we have to be a little fundamentalist about human rights. No matter what happens, they are paramount. Human rights apply online and offline; there are no special considerations in terms of the responsibility that is there.

We may have certain issues in terms of how we implement human rights online, and who's responsible, and how responsibility links back to the fundamental responsibility of states that have signed up to human rights governance.

But let's not give in to exceptionalism, because there we are on kind of a slippery slope. So to conclude, I'm really glad to see the progress that's being made, and congratulations to our colleagues from the Office of the High Commissioner for Human Rights.

We'll continue to work with you closely to make sure that this is done.

>> PEGGY HICKS: Thank you so much. It's such a pleasure to have the tech envoy in place and with such impressive support and expertise, that really adds so much to our work in this area. Having that high level engagement and support is really crucial to ensure this process delivers what we're looking for across the UN system as we've described.

We've largely run out of time. I'll beg your indulgence to be able to go back to Catie one more time, because we had some additional questions and want to give a chance to reflect on them. I know some others may have had comments or questions.

I didn't formally introduce you to Yoojin Kim, who has been with us and a moving force in putting this panel together. Thank you for that. But she's also working with the team in Geneva. So I'm happy to stay after and take any additional thoughts and comments, and Catie and Scott are on standby.

Over to you, Catie, for any final remarks, please.

>> SCOTT CAMPBELL: Thanks, Peggy. I'm just going to jump in for Catie as she handles a childcare issue. But I just wanted to really thank the Secretary‑General's Envoy on Technology for those remarks, which are really helpful in framing this within the bigger picture.

I know we're out of time, but I did just want to clarify that indeed the draft document version 3 is available to all, and has been circulated with all stakeholders through the roundtable 3A‑3B multi‑stakeholder group. I will post that again in the chat now.

We certainly don't have time to address all of the questions, but I would just note, as a cross‑cutting theme, that the Secretary‑General, in mandating our office to carry out this piece of work, was very clear in asking us to hold consultations with those most affected by the UN's use of digital technology. So we have been determined to do that through a multi‑stakeholder process and through the vast network of Civil Society partners in the global south ‑‑ and a shout‑out to Access Now for allowing us to partner to reach those most affected. We're out of time, Peggy, so I'll leave it at that if that's OK.

>> PEGGY HICKS: Great. I realize we didn't have a chance to go in depth on some of the comments made. But obviously the door is open in terms of the ongoing conversation with the team. I know you are all waiting with bated breath for that fourth draft, as I am as well. But obviously, Scott outlined the timetable at the beginning.

We're really hoping to see movement on this in the first part of 2023, bearing in mind Peter's earlier comments about the need for this to have been in place decades ago, and the tech envoy's comments about the fact that this will be a continual process of learning as we move forward.

Looking forward to moving the UN forward on human rights due diligence in the coming years. So thank you all for your active listening and engagement on this issue and looking forward to more in the future. Thanks again.