IGF 2025 – Day 4 – Workshop Room 5 – WS #190 Judging in the Digital Age Cybersecurity & Digital Evidence (RAW)

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

(Music playing)

>> NAZARIUS KIRAMA: Thank you so much and welcome to this session, Judging in the Digital Age: Cybersecurity & Digital Evidence.

We're on channel 5. If you would take your equipment and turn it on, put it on channel 5.

Today, we have a number of speakers from various jurisdictions, in terms of our IGF segments.

Can I have the slide, please? Thank you.

My name is Dr. Nazarius Kirama from Tanzania. I work with the Internet Society Tanzania Chapter. I am also the coordinator for the Tanzania Internet Governance Forum.

Today, we are going to have a very good session on judging in the digital age, cybersecurity, and digital evidence.

Why are we here? Because courts are digital. Digital evidence is central to more cases, mobile data, emails, metadata, surveillance footage, blockchain logs, and AI‑generated content.

We have cyberattacks on infrastructure that can threaten data integrity and access to justice. In that case, we must talk about judges assessing tech across legal dilemmas.

We've been working since 2022 to bring judges into the Internet governance space.

In 2023, in Japan, we had a session called Judges on Digital Rights Online.

The goal is to link judges with cybersecurity experts and policymakers.

There is a platform for dialogue. We are shaping the culture for the digital age.

On digital evidence and cyberthreats, you will find there are so many things at stake. How do we verify files across different legal systems? That's question number one.

Digital chain of custody: what is done to make sure evidence remains untainted? That's question number two.

How do we protect institutions? That's question number three.

AI and justice, can we trust machine learning in evidence analysis or sentencing? Question number four.

Judges need continued specialised training to keep up with emerging tech like AI.

The next slide is about our vision. We need resilient, digitally literate people in the judicial system.

We need to have harmonisation to handle things uniformly.

Share practices across civil, common, and hybrid legal traditions, and develop capacity-building programmes on cyberlaw, data protection, digital forensics, and digital literacy.

Work with civil society and tech developers, and empower courts not just to catch up but to lead in shaping responsible digital justice.

 

I will welcome Eliamani Isaya Laltaika, from the High Court of Tanzania. Honorable Eliamani, as a High Court judge, how are your courts handling the admissibility of digital evidence, especially when such evidence is from outside your jurisdiction or lacks a clear standard for authentication?

>> ELIAMANI ISAYA LALTAIKA: Thank you very much, Dr. Nazarius.

First and foremost, my appreciation to the IGF secretariat.

Before I answer your question, I would like to unpack some of these concepts. From a legal point of view, cybersecurity is the set of technical, legal, policy, social, and even diplomatic processes to keep cyberspace safe for all users, including children and people with disabilities, across regions. That's the whole concept of cybersecurity: to ensure cyberspace is safe for all of us to use.

Evidence, now, is information with value presented to a court for a judge to consider in making a decision on whether something has happened or has not.

Digital evidence, or electronic evidence, is a newcomer in the development of law and the judiciaries all over the world. Before that, only hard copies were used to prove something happened or not.

When courts consider whether to admit evidence or not, there are usually five considerations.

This doesn't distinguish whether that piece of evidence is from one jurisdiction or another country.

The first is relevance. I will use my own judgment to determine if something is relevant to the case I'm addressing.

The second is authenticity. If you're telling me this is a video of someone using a knife on a passer‑by, I should be able to know that's exactly what is being done, and it's not a cartoon.

The third is a verifiable system. I should be able to verify the system from which that video was extracted or that email was printed out.

Number four, chain of custody. I should know who took care of that piece of evidence. How many hands did it change through before it came to my court?

Finally, and this is a little bit technical, I would check whether it complies with the statutory requirements of the evidence act of my own country. Every country has its own legal system, has its own precedent, has its own way of judging evidence.

So if a piece of evidence passes that process, then there is no discrimination, really, whether it is from my jurisdiction or not. And I would only say that people think this is only from the movie, but I can say these are actually things that are happening, and each one of you is currently creating digital evidence, or electronic evidence, from the pictures you are taking, from your geolocation, from the voices you are sending over WhatsApp. Meta tags can be used to show what someone did. Everything you're doing from shopping online to walking into a casino, that is actually building some sort of digital evidence ecosystem.

What does this mean in practice? It means that cybersecurity law is much, much beyond what many people consider criminal.

Law, as I said last year ‑‑ no, three years ago, in Japan, is not only about punishing people. There are many roles of law, and I will conclude with this. Law can play a punitive role: so and so has done something wrong and must be punished.

Law can play a facilitative role. Law facilitates you to acquire digital assets. No one is punishing you. You're just qualified to get some rights, and the law is there to say, yes, give this person a license or clear this person of whatever defamation.

And the third role of law is cooperation with other authorities. The law is there to ensure there are bridges. When you want to consult a government entity, there are legal procedures that must be followed. And if they refuse to do something, you can still come to court and say, I want you to issue an order to command the police officers to give me a clearance for me to be able to travel.

The judge will say, I want to command you to do your function in enabling this person to do what they want.

So we want people to have this broader look at law rather than just thinking a judge or a police officer is there to wait for you to be punished.

Thank you very much.

>> NAZARIUS KIRAMA: Thank you, Honorable Judge. I have a request to make. If you could issue an order right now for every speaker to stay within five minutes for each submission, I think I would appreciate that.

(Laughter)

>> NAZARIUS KIRAMA: Now I move to Professor Peter Swire. Are you there? Can you introduce yourself?

>> PETER SWIRE: I don't know if the video works. Oh, there it is. My name is Peter Swire. I work on these issues as the leader of the Cross‑border Data Forum, where we do a lot of work regarding data going across borders, including ‑‑ access.

>> NAZARIUS KIRAMA: What are the key principles or methodologies that judges and lawyers must understand to critically assess the reliability and chain of custody of digital evidence, especially when it is presented through automated or AI‑generated tools?

You have five minutes.

>> PETER SWIRE: Yes. Thank you for having me here. I'm teaching in Spain, and I'm happy to be on this panel.

We have work at the forum on government access to data across borders, such as under the Budapest Convention. We've written about how conventions in Africa might be useful for helping governments get access to data held in other countries. Without that access (the United States has a blocking statute), it's hard to get the email communications for the judges.

So that is background.

I would like to emphasise the three areas where things are different. First, I will tell you how much the digital issues are the same.

Listening to the Distinguished Judge right now, the principles are the same as those that existed before the Internet happened, to a certain extent.

You've always faced the problem that maybe this piece of paper has a fake signature on it. Now it may be a fake document electronically. It's been a problem for judges forever about whether to believe the evidence that comes into court.

The first is authentication. Somebody may say they are writing from a police agency or a prosecutor's office, but, in fact, they're faking it. They might be from some other place. And this was mentioned by the judge.

And so the first thing to trust in evidence is that you're dealing with the right party, the right person sending you the data.

And in a world now where passwords can be broken many times, the standard good technology is what is called two‑factor authentication. Many of you have used this: you log in with a password, then they send you a code, and you enter that code. That's much harder to fake than a simple password‑based system.

So that's the first thing, some confidence you're dealing with the right people for authentication.
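The password-plus-code flow Professor Swire describes is standardised as time-based one-time passwords (TOTP, RFC 6238), and the whole scheme fits in a few lines of standard-library Python. This is an illustrative sketch, not anything used by any court system mentioned in the session; the secret below is the RFC's published test key.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    counter = for_time // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: this secret at T=59 yields the 8-digit code 94287082.
print(totp(b"12345678901234567890", 59, digits=8))
```

The server and the user's device share the secret and both compute the code; because it changes every 30 seconds, a stolen password alone is not enough to log in.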

The second question is chain of custody: whether you believe the document that came from Alice is the same document that's received by Bob. And we have well‑established procedures, what are called digital signatures. The basic idea is that Alice sends a document and does a mathematical operation on it called a "hash," and a unique number emerges. On the far end, if one word is changed, the hash of the document changes.

So these digital signatures prove that what left Alice is the same as what is received by Bob. So digital signatures are very important.
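The "hash" step described here can be sketched with SHA‑256 from Python's standard library. Note this shows only the tamper-detection half of a digital signature; a full signature additionally signs the hash with the sender's private key, which is omitted here. The document text is a made-up example.

```python
import hashlib

# Hypothetical document and a copy with a single word changed
doc = "Alice agrees to transfer the property on 1 July."
tampered = "Alice agrees to transfer the property on 2 July."

h1 = hashlib.sha256(doc.encode()).hexdigest()
h2 = hashlib.sha256(tampered.encode()).hexdigest()

# Changing even one character yields a completely different hash
print(h1 == h2)  # prints False
```

If Bob recomputes the hash of what he received and it matches the hash Alice signed, the document arrived unaltered; any edit in transit breaks the match.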

The third question, which has come up more recently, is: what about AI? We know that AI can have hallucinations. We've seen law cases in the United States where a lawyer put a question into an AI system and got back case citations that were not true. It made them up, because AI and large language models use predictive technology, not factual lookup. When you receive documents that are generated by AI, or might be generated by AI, should you believe the citations? There's no real answer except to double‑check the citations. If you have time, double‑check all of them. Go to the link on the page and make sure it says what it claims to say.

We did that and we had to check to make sure the lawyers were giving the correct citations.

Maybe you do a sample. Try 10 or 50 or whatever the number is and start to see if you have any fake citations come in.

I think what I'm emphasising for today is, in many ways, the problems are the same that judges have had forever, but we have to be sure about authentication. Is this really the person? We have to have assurances on chain of custody and signatures. We have to check the AI sources because, otherwise, it may be a fake citation that you don't trust.

So I will stop there. Thank you very much.

>> NAZARIUS KIRAMA: Thank you, Professor. Now we have learned that AI citations are predictive, not definitive. That's why we're bringing the whole court system into the IGF.

I want to go to Dr. Jacqueline.

>> JACQUELINE PIGATO: My name is Jacqueline Pigato. It's a pleasure to be here. Even though I'm not a lawyer, I will try my best to answer your questions.

>> NAZARIUS KIRAMA: We're a multistakeholder body of the United Nations, the IGF.

Now, I received the order from the judge for every speaker to stay within five minutes.

Thank you, Professor, for staying within five minutes. It was actually around 3.43 minutes.

Now, Dr. Jacqueline, with the rise of things like spyware and state surveillance tools being used in the name of national security, how should judiciaries treat evidence gathered with such tools? What role do courts play in protecting civil rights when navigating methods that may be legally or ethically contested?

>> JACQUELINE PIGATO: Thank you. I think I'm going to bring in the concept of spyware for those not familiar. I think most of us know by now that spyware refers to surveillance technologies used to secretly extract data from devices without the owner's knowledge and with minimal legal oversight.

These tools are increasingly being used in ways that erode democratic institutions and legal protections.

Their undetectable nature means that violations often occur in the shadows, beyond the reach of public scrutiny or legal remedy.

As highlighted by one of our projects in Brazil, this violates not only privacy and due process but also the separation of powers.

In Brazil, we have the (?) case that illustrates this threat. A spyware tool was deployed by intelligence officials to surveil targets that ranged from activists to Supreme Court justices themselves.

So it is a paradigmatic case that stresses the features of cybersurveillance, but it's only one example in a broader context of lack of oversight.

This reveals a structural problem. When surveillance happens outside legal frameworks, courts are sidelined and unable to guarantee fundamental rights.

In the Brazilian context of use of spyware by the state, there is a Supreme Court case pending in which the regulatory gap that allows for the current state of things is being challenged as unconstitutional.

In this case, we argue that the use of spyware for surveillance by the state should be ruled unconstitutional, since, even in the context of criminal prosecution and law enforcement, the nature of how these tools work (taking advantage of vulnerabilities found in devices) results in a level of intrusion that's difficult to justify under democratic parameters.

However, even if the entire system is not ruled unconstitutional, we are requesting that strict criteria be established for the use of spyware.

In particular: prior authorisation, and adherence to strictness similar to other situations of confidentiality breach; an interpretation of communication confidentiality updated to contemporary standards of intrusiveness; and the inclusion of mechanisms to respect the chain of custody ‑‑ and others compatible with the constitutional order.

So I will stop here now. Thank you.

>> NAZARIUS KIRAMA: Thank you, Dr. Jacqueline. That was very informative. I'm very glad you could do this submission.

Now I go to advocate Omar ‑‑ from Pakistan.

>> OMAR HAHN: Thank you. This is Omar Hahn (phonetic) from Pakistan. I'm a lawyer working on digital rights in Pakistan. It's new. We're having our first ‑‑

>> NAZARIUS KIRAMA: Thank you so much, Omar Hahn, from the Advocate Court in Pakistan.

Given that you have vast experience as a digital rights and defence lawyer, how do you see the balance between state surveillance for cybersecurity and the individual's right to a fair trial, particularly when digital forensics are used to prosecute cybercrime?

>> OMAR HAHN: To prosecute a digital crime or cybercrime, surveillance is important. It is the government or the state looking after the general public. In the digital world, when the world has become ‑‑ on one click, certain issues for the general masses are arising at the same time.

It's key. Today you cannot go without the Internet. Everybody is doing their own job online, and anybody can commit a crime.

So surveillance is there to get data.

How do they look at the principles of legality, proportionality, and transparency?

Is the surveillance the state agencies are doing protecting the rights of the people?

Along with this, is it not violating the right to privacy, the right to dignity? These are the human rights given to citizens by the law, the constitution, or the declaration of human rights.

I believe all these things are important, but so is how the data collected from end users is protected. We have seen around the world that, at the state level, user data has been shared. I think this is important.

The second one is digital evidence and forensics. At the end of the day, if a crime has been committed, it is the evidence that has to prove whether the crime has been committed or not.

As the Professor and the Honorable Judge have mentioned, that's very important. Is the evidence collected following the SOPs? Is the forensic process legal? When you're prosecuting a crime and collecting evidence, it has to follow digital forensic standards in a way that can be proved, because forgery is easy now. There are tools that create hurdles for people.

It is for the state to ensure the standards of digital processes so that a crime that has been committed is prosecuted in a way that does not violate the right to a fair trial, which is very important.

This is from my side, Dr. Nazarius.

>> NAZARIUS KIRAMA: Thank you. I appreciate your intervention.

Now I go to our policy researcher. I would like to ask you to use the next moment to introduce yourself, and that will be followed by a question.

>> Hi, everyone. I'm a senior associate at Idea For Change, an India‑based organisation working for social justice.

Very happy to be here and share the space with the esteemed panellists.

>> NAZARIUS KIRAMA: Thank you so much, Marianne (phonetic).

From a feminist legal and policy perspective, how can judicial systems be better equipped to handle cases of online gender‑based violence and harms when evidence sits in algorithms and transnational digital ecosystems?

>> MARIANNE: Thank you. I would like to share some insights from research by Idea For Change, the organisation I work with, which looked at gender‑based violence cases in India and the challenges commonly encountered in prosecuting such cases, especially digital evidentiary issues.

Gender‑based violence is a criminal offense in India under the Penal Code.

You have to prove the person guilty beyond a reasonable doubt.

In gender‑based violence cases, there were difficulties in bringing in expert testimony, relying on witnesses, and ensuring the validity of digital evidence.

Evidence will be admissible under two conditions: if the original computer source on which the evidence is recorded is produced, in which case it's primary evidence; or if copies are produced ‑‑ in which case, we need a certificate for authentication.

In our research study, we found that, in many cases, the court tended to dismiss the evidence because of the lack of an authentication certificate.

The issue is that it's sometimes very difficult to obtain the authentication certificate, especially if there's no access to the source.

In many cases, they may have applied to obtain the certificate and it was not issued, or they may not be aware of how to get the authentication certificate.

In such cases, the court issued a dismissal because of concerns about authenticity.

This deprives survivors of access to justice.

Often the prosecution fails to submit digital evidence, and the main barrier comes from the lack of cooperation from digital platforms, like social media platforms, in responding to requests from law enforcement agencies to provide information.

Despite asking for information from the telecom services providers, sometimes there's a delay.

In many cases, the evidence and other materials and devices have to be submitted to the state, and there have been accusations of leaking. There are issues with preserving the chain of custody.

I also wanted to talk about an observation from the study. In many cases, we found that courts sometimes treat these cases as less serious because there's an assumption that online violence is trivial and doesn't involve injuries.

In such cases, the court tends to focus on the physical harm and ignore the online aspect of the crime.

And so that's one of the main findings of our study.

Similarly, in cases involving online violence, especially on digital platforms, there's a need to recognise a crucial role of platforms and the roles they play in amplifying the harm.

Sometimes the court tends to not look into the accountability of social media platforms or to issue orders in a timely manner to take down the content.

In terms of how we can better ‑‑

>> NAZARIUS KIRAMA: You have one minute.

>> MARIANNE: Rules and procedures, as the previous panellists have said, vary from jurisdiction to jurisdiction. But it would be good if digital evidence were not dismissed merely on procedural grounds. The court can use its inherent power to summon ‑‑ the court recognises the challenges and ‑‑ courts should look into the platforms for harmful content. This includes ‑‑ their role in the harm in question, or issuing suitable orders to take down the content.

Finally, it's very important to recognise the online public sphere and to go beyond the traditional understanding of harm that's focused on physical injury. Yeah. Sorry.

>> NAZARIUS KIRAMA: Thank you, Marianne, for the very elaborate work that you're doing.

I have an apology from ‑‑ from the High Court of Ghana. She was supposed to join us online, but she got an emergency at the last minute and was not able to join.

So apology from Honorable ‑‑ from the High Court of Ghana.

Professor, are you still online? Peter Swire.

>> PETER SWIRE: Yes, I am. I just had to unmute.

>> NAZARIUS KIRAMA: If you think of these issues and were to come up with one red flag judges should look for in a digital forensic report, what would that be?

>> PETER SWIRE: Well, I had not prepared for that question. I thought you might ask for one piece of advice to judicial systems, and I want to mention the problem of ransomware, which is the possibility that a bad actor will try to lock up the files of a court system so that the judges and the courts lose access to the files.

When I teach my cybersecurity class, I say to people, for ransomware, the important thing is to have offline backups, if possible, and you have to protect the backup systems, too, because the bad guys try to get into the backup systems.

We've seen courts get hit with these ransomware attacks, and having a backup, to get things back as they were yesterday, that's important to have in place.
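Offline backups are most useful when you can also prove the restored files are unchanged. One common approach, sketched below in standard-library Python, is to keep a manifest of SHA‑256 fingerprints alongside the backup. The function names and file paths are illustrative, not anything a panellist named.

```python
import hashlib
import os

def build_manifest(root: str) -> dict:
    """Record a SHA-256 fingerprint for every file under `root`."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                manifest[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
    return manifest

def verify_backup(root: str, manifest: dict) -> list:
    """Return relative paths whose contents no longer match the manifest."""
    current = build_manifest(root)
    return sorted(
        path for path in set(manifest) | set(current)
        if manifest.get(path) != current.get(path)
    )
```

The manifest itself should be stored separately from the backup (and ideally offline), so that an attacker who encrypts or alters the backed-up files cannot also rewrite the fingerprints.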

You asked about red flags. I think the thing I would worry about is whether somebody on the other end of the line is really who they say they are. Right? We know in our personal lives that we may think we're talking to somebody online, and it's somebody else.

So having a channel to communicate with them and a second channel to make sure they're really who they say they are, that kind of two‑factor thing is really important because, otherwise, you may be getting evidence from somebody who is not even the right person.

>> NAZARIUS KIRAMA: Thank you, Professor.

Now I go to Dr. Jacqueline. I know you are not a lawyer, but all of us, in some way or somehow, will end up in court.

What would be your suggestion or recommendations on spyware? If you could, spend one minute on that.

>> JACQUELINE PIGATO: Thank you. In Brazil, we are developing key recommendations based on the research. Just to clarify, I spoke about the Brazilian case, but this is not exclusive to Brazil. We have cases in Colombia. We have an important precedent in the U.S. with the Pegasus case and the damages awarded to Meta.

Let me say some recommendations we're working on. I think we have four key recommendations, I would say.

The first one is to develop technical and legal standards for the judicial chain of custody, including metadata preservation, access logs, authentication layers, and independent audit trails.

The second is train judges and legal professionals in cybersecurity, forensics, and especially data protection.

The digital knowledge gap we think is no longer a risk we can afford in this scenario.

The third is to equip courts with cybersecurity protocols and contingency plans to strengthen resilience against cyberthreats, including unauthorised access to judicial data.

And last but not least, of course, promote multistakeholder dialogue among courts, technologists, civil society, and policymakers.

I think digital systems must evolve collaboratively to meet these realities we are talking about here.

So thank you.
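The chain‑of‑custody standards in the first recommendation (metadata preservation, access logs, audit trails) are often realised as hash‑chained logs, where each entry commits to the hash of the entry before it. A minimal Python sketch, with hypothetical actors and actions:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

def _entry_hash(actor: str, action: str, prev: str) -> str:
    payload = json.dumps({"actor": actor, "action": action, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def add_entry(log: list, actor: str, action: str) -> None:
    """Append an entry that commits to the hash of the previous entry."""
    prev = log[-1]["hash"] if log else GENESIS
    log.append({"actor": actor, "action": action, "prev": prev,
                "hash": _entry_hash(actor, action, prev)})

def verify_log(log: list) -> bool:
    """Recompute the chain; any edited, reordered, or deleted entry breaks it."""
    prev = GENESIS
    for e in log:
        if e["prev"] != prev or e["hash"] != _entry_hash(e["actor"], e["action"], prev):
            return False
        prev = e["hash"]
    return True
```

Because each hash depends on every entry before it, silently rewriting one step of the custody record invalidates everything that follows, which is what makes such a trail auditable by an independent party.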

>> NAZARIUS KIRAMA: Thank you so much.

Marianne, can you share one way courts can be responsive to survivors of online violence? If you can present it in one minute, it would be clear.

>> MARIANNE: Sure. I will try. As I said in my previous intervention, it's very important to understand the online public sphere itself: the unique harms people face, and also the role of the platforms in amplifying harms.

Secondly, it's really important for the courts to uphold the right to privacy of survivors in cases of online violence, and ‑‑ it's also very important to conduct periodic training for lawyers and judges, the legal fraternity, because, as the technology changes, you need to stay updated about the kinds of harms it poses, especially to vulnerable groups.

For instance, our organisation has developed a resource guide for judges on how they can use the existing laws to successfully prosecute online gender‑based violence cases and by respecting the rights of the survivors.

>> NAZARIUS KIRAMA: Thank you, Marianne.

Now I go to the judge of the High Court. How do you see regional ‑‑ helping with these issues?

>> ELIAMANI ISAYA LALTAIKA: First, the Internet is borderless. The virtual space doesn't respect the traditional boundaries that we have known.

Secondly, most of the progressive cybercrime and cybersecurity laws have extraterritorial jurisdiction.

In Tanzania, we do not have jurisdiction only against the people who commit these offenses while in Tanzania.

The law ‑‑ I think it is Section 4, Subsection 2 ‑‑ empowers courts to deal with anyone in the world who attacks a citizen in Tanzania, or anyone using a computer system while in Tanzania. The law will catch up with you. To be able to exchange data and get extraterritorial expertise, one must be able to communicate.

We must get into the 21st century in protecting citizens.

>> NAZARIUS KIRAMA: From the legal perspective, the civil society, what do you think should be the legal safeguards that are most urgent to protect defendants in digital crime cases?

>> OMAR HAHN: There's the principle of innocent until proven guilty. So a person is innocent until he's proven guilty.

There are certain challenges which are often faced by defendants in digital crime cases. One is outdated legislation.

A law passed in 2015 cannot simply be applied in 2025, because crimes are changing every single day.

I think the legislation needs to be properly updated according to the needs of the digital world.

Along with that, there should be proper legal procedure to protect the accused, who is the defendant. His rights should be ensured, and he should be given a fair chance at trial.

There should be a proper warrant process for seizing his property, if it is involved in the digital crime.

In Pakistan, unfortunately, I will mention, when somebody has done a crime, the entity responsible for handling the digital crime will rush ‑‑ it's important to ensure independent protection.

One more thing which is very important is that judges are still trying to understand digital forensics and the cyberlaws.

I mentioned that in the whole district, the capital of my province, there are only one or two judges handling cybercrime cases.

I believe there should be provision for speedy justice, a speedy system to decide cases, and the judges should really know about digital forensics and the cyberlaws, and this should be included in judges' training. This is what I believe.

>> NAZARIUS KIRAMA: Thank you very much. As the speaker prepares, I have received a note from the High Court of Tanzania to open the Q&A for the audience.

Before we continue, if you have any question, you can raise your hand. Before that, we have a question from online, if you can read that.

Then we have a gentleman from DRC Congo. After the online question, you can come in with your question.

There is a lady over there. You will be second.

I also saw a hand there in the back. All right. Let's go.

>> I'm the online moderator.

This question is ‑‑ in Ecuador ‑‑ I don't know if it would be contempt of court.

There's a case in Ecuador since 2019. His case ‑‑ misuse of digital evidence in judicial proceedings.

In his trial, a simple photograph was used, which showed a connection from an unverified user to an IP address, to support an alleged attempt to gain unauthorised access to a state telecommunications system.

It's been said that access is not proof of a digital crime.

So what other protocols must be in place to ensure that alleged digital evidence is not used to maliciously prosecute individuals?

>> ELIAMANI ISAYA LALTAIKA: I think we're usually not allowed to talk about any case in the court, and we will refrain from doing so.

>> From a journal perspective ‑‑

>> NAZARIUS KIRAMA: But there is an order from the ‑‑

>> Thank you, My Lord. Thank you.

(Laughter)

>> NAZARIUS KIRAMA: The case is still active?

I think we'll refrain from answering.

You just prepare.

We have a gentleman from the DRC Congo.

 

>> FLOOR: Thank you. I'm a Congolese magistrate.

What recommendations does the panel have for magistrates in resource-constrained environments to critically assess digital evidence presented to them, and how can they mitigate the most pressing cybersecurity vulnerabilities within the courts' operations?

Thank you.

>> NAZARIUS KIRAMA: Any of the panellists? Anyone?

>> PETER SWIRE: Maybe I can. There's many good possible steps. One good thing is to have backup so you don't lose the court record. That's what I said about ransomware.

Digital storage is relatively inexpensive. If you lose your records, you are in a difficult position.

If you have storage ‑‑ you have the ability to have a fair trial.

>> ELIAMANI ISAYA LALTAIKA: The judiciary does not operate in a silo.

The standard of security that applies to records of Parliament applies to the court as well.

 

>> NAZARIUS KIRAMA: Thank you, judge.

I saw a hand. And then there's another hand over here, if you can be on the mic.

Do we have to take all the questions and then respond at once?

>> As they come.

>> NAZARIUS KIRAMA: As they come. Okay.

Yes?

>> Hi. I'm a professor in Colombia. I would like to raise an additional issue: a moment before the digital evidence comes the contact information and notification. An American data protection network recently published an open letter to hold companies accountable on data processing.

It's an open invitation directed to companies that process personal data and are not established in the country.

First, to assign a representative for data protection before the national authority, to provide contact details, and to empower them with judicial and administrative representation.

Second, to provide agile and effect ‑‑ data owners to receive notification of a judicial process or administrative investigations ‑‑

>> NAZARIUS KIRAMA: We don't have time. Can you ‑‑

>> It's addressed to multinational companies. Not necessarily regarding data protection, but you may have some insights on how to deal with it.

>> NAZARIUS KIRAMA: You can share that with us through email so we can put that through our channels of communication.

>> FLOOR: My question is for Omar Hahn about the recent amendments in Pakistan that are meant to address cyberharassment. Do you think they really serve their purpose? It's ‑‑ freedom of expression in Pakistan.

>> OMAR HAHN: If the government thinks fake news has been shared, a person can be jailed for five years and fined ‑‑ rupees.

Coming from a legal background, I have concerns about that. Who decides whether news is fake or false? When someone is criticising the government, is that a false fact in the government's view? Will he be tried or prosecuted?

I believe there is concern about that among human rights defenders and digital rights defenders, and I believe the courts should look into the matter.

This is, in a way, a violation of fundamental rights.

>> NAZARIUS KIRAMA: Thank you so much.

Next, please.

>> FLOOR: Good morning. My question is: how do we safeguard the dignity and privacy of people with disabilities?

How can courts protect medical information shared during hearings or ‑‑ sensitive testimony?

>> ELIAMANI ISAYA LALTAIKA: Under data protection laws, there are ways that judges are instructed to ensure ‑‑ hearing. So a sensitive case involving someone who wants to protect their privacy does not have to go into open court.

Also, there are mechanisms to anonymise information, making sure no real names are used, so we still arrive at justice but without unnecessary exposure. So everyone is catered for.

>> PETER SWIRE: Can I add just briefly on that?

>> NAZARIUS KIRAMA: Yes. Go ahead, Professor.

>> PETER SWIRE: In the States, we have HIPAA, which I worked on when we created it, and it has a mechanism for what they call qualified protective orders. One option is that only the judge looks at the sensitive evidence, in camera, just the judge.

The next is to allow only the two parties to see it, and the judge creates a protective order. You can find this online if you search for qualified protective orders.

>> NAZARIUS KIRAMA: Thank you very much. Is there any question online?

Thank you so much.

Is there anybody?

>> There was a lady with a question.

>> ELIAMANI ISAYA LALTAIKA: There was a question that was not responded to. I just want to say to the professor from Colombia: there are currently ongoing UN mechanisms to ensure that country laws are not too restrictive. So there are diplomatic processes.

We have yet to ratify all these protocols, but as soon as they are ratified, it will be ‑‑

>> NAZARIUS KIRAMA: I will begin with Dr. Jacqueline. Today, we have participated in Judging in the Digital Age: Cybersecurity & Digital Evidence. Reflecting on the way you have interacted with the audience and the questions they have brought to the panellists, what would you share today?

>> Jacqueline: I'm going to go back to Brazil because it's the context that I know. In Brazil, we have the right to data protection as a constitutional right. This happened in 2022, and I think it was a great victory. We also have a comprehensive law, the LGPD. But we still don't have a criminal data protection framework, and I think that's an important gap to address. All of the questions and discussions that we raised here today share this concern: we need such a framework to fight privacy‑related crimes and to address the weaknesses of the digital system in handling cybercrime effectively.

So this gap in Brazil has been identified in some legislative reform efforts, including the reform of the Brazilian Code For Criminal Procedure, but no law has been approved so far. So I think that's one development that we still have to fight for.

>> NAZARIUS KIRAMA: Thank you, Dr. Jacqueline.

Marianne, the same question goes to you also.

Just, in one minute, if you can give your ‑‑ what you would like to be seen, for example, in the future.

>> Marianne: Speaking of the world I work in, the online gender‑based violence, I think all should be safe spaces for women and other survivors of online violence that are seeking justice.

It's very important to have inclusive policies and ‑‑ to deal with this in a respectful manner. The current realities need to be represented.

Changes at the legislature level are crucial for things to work in this case.

>> NAZARIUS KIRAMA: Thank you.

Professor Peter?

>> PETER SWIRE: Thank you very much. And thank you for having me participate in this very well‑organised session today.

The data forum we created several years ago has stated goals on its website. One is that the government should get access to data when there's a genuine need, for example, a crime and a warrant backed by a judicial court order.

On the other hand, we need privacy and data protection to apply when these requests come in. Imagine you're a company or a court and there's a request from another country whose practices you don't know. Maybe it's a request for data to suppress political dissent. Maybe it's not following data protection rules.

Our work has been to figure out how to provide proper access when it's a criminal matter and the right showing has been made; how to uphold privacy and data protection when those rights need to be maintained within the system; and how to make it workable so the people who hold the data know what their responsibilities are and laws don't conflict with one another.

I put the links in the chat.

You have to protect people's rights, and I think that's the challenge we face very much.

>> NAZARIUS KIRAMA: Thank you, Peter.

If you were to give us advice as a judge, what would your parting shot be, in brief?

>> ELIAMANI ISAYA LALTAIKA: Just to say thank you to the many people, especially the civil society fraternity who are doing a fantastic job to ensure that there's capacity building for judges.

My colleagues from India and Brazil, and even from Pakistan, have highlighted that there is a knowledge gap, and it is true. Some of us are not aware of some of these developments in cyberspace. You can still find a court in the forum and you ‑‑ we have to ensure we invite judges to these forums and provide capacity building.

We should not sit in courts only to convict someone wrongfully because we do not know the law or just a bit of science.

Thank you. You have been ‑‑ in Tanzania. That's how you got me out of my chamber to travel to many places. I can assure you that what you're doing is making a difference, not only in Tanzania but across the continent and in other parts of the world.

>> NAZARIUS KIRAMA: Thank you.

Dr. Omar, two minutes.

>> Omar: I believe this is the second time I'm sitting with people from the ‑‑ background alongside the legal and judiciary trade. I believe this should continue, because at the end of the day crimes are happening, and it is the courts, the prosecutors, and the ‑‑ who are going to handle these cases.

As mentioned by the Honorable Judge, the capacity building of judges is important. They should not just be sitting in the courts.

When we went back home, we tried to arrange a training for the judicial academy of my province. Unfortunately, that didn't happen because of the timing and Ramadan.

>> NAZARIUS KIRAMA: Is the professor from Colombia still around? You have one minute for your parting shot, if you have anything to say.

>> FLOOR: I would say just that there are a lot of challenges in Colombia. Our main challenge is notification. I mentioned this regarding data protection investigations, but in human rights cases, too, we have a specific issue with notification. So that is one of the main capacity‑building areas.

>> NAZARIUS KIRAMA: Thank you. I will keep in touch.

Thank you, ladies and gentlemen, for attending our session. We have come to the end. Thank you so much for your contributions, and thank you for listening. One of the goals of this session was to bridge the divide between the multistakeholder community and the judiciary: to help transform the judiciary into a better institution and to get judges out of their silos so we can make justice better for every single one of us.

Thank you so much.

(Applause)