The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
(Music playing)
>> RAQUEL DACRUZLIMA: Hi. Hello, everyone. It's great to welcome you to "A competition and rights approach to digital markets". Before we start, I would like to invite anyone who would like to join us here at the roundtable, where we have mics, which makes it easier to ask questions at the end of the session.
Feel free to sit with us like André.
I would like to thank our panellists for being here, first, Paula and Hannah.
My name is Raquel daCruzLima. I'm a human rights lawyer based in Brazil, and I work in South America at a human rights organisation dedicated to the protection of freedom of expression.
For freedom of expression, diversity is vital. For that reason, human rights bodies, such as the ‑‑ Court of Human Rights, have long stated that restricting the means by which freedom of expression is exercised also limits the expression of opinions itself. Therefore, in order to protect freedom of expression and access to information, states have a duty to prevent excessive concentration in the media sector.
Nowadays, the concentration of power in digital markets is a growing concern, and it was even mentioned yesterday in the UN Secretary‑General's address to the IGF. There are already many efforts to tackle that power, and one important example comes from the EU's Digital Markets Act, the DMA. So I would like to give the floor to Bruno, who is online, to help us understand a bit more what the objectives of the DMA were, how it proposes to address the concentration of power in digital markets, and whether the protection of freedom of expression and human rights was one of the goals pursued by the DMA.
So, Bruno, I would appreciate if you could start by introducing yourself and talk about the DMA.
>> BRUNO CARBALLA SMICHOWSKI: Thank you for the invitation. Hello, everyone. I'm Bruno Carballa Smichowski. I'm a research officer at the European Commission Joint Research Centre, which supports evidence‑based policy, including the DMA.
I'm an economist working in the digital markets research team.
So I will try to walk you through the spirit of the DMA to explain how it links to the broader issues that have been discussed today.
So perhaps a very small disclaimer about what the DMA is.
Can you hear me?
>> RAQUEL DACRUZLIMA: You are back.
>> BRUNO CARBALLA SMICHOWSKI: Perhaps a clarification that the DMA has a purely economic objective, which is precisely to reduce the market power of the so‑called gatekeepers. I will come back to which platforms are called gatekeepers.
But this has an effect on the capacity of platforms to abuse their power in non‑economic ways, which is closer to the focus of this forum, such as all sorts of human rights violations.
That said, obviously, there are other regulations that have a specific, non‑economic target and have more to do with human rights. Think specifically of the DSA, the Digital Services Act, which is a regulation that wishes to curb discrimination and so on.
So how does this work, and what are the expectations of the new regulation?
On the timeline: it's a recent regulation in legal terms. It entered into force in November 2022, and its articles really became applicable in May 2023. So we're talking about two years for the DMA. That's quite young. We'll come back to this in a couple of minutes.
We already have the first decisions on how companies are or are not following the rules of the DMA.
The goal is to curb the power of the so‑called gatekeeper platforms. For that, the DMA defines what a gatekeeper platform is.
The first criterion: it has to be a big platform. So not every platform on the Internet, which would be practically impossible to cover, but those that do have a much stronger impact.
In that sense, these platforms have to show at least €7.5 billion over the last three years, in terms of revenues and market cap. So they have to show they have big economic power, in terms of size.
They have to be part of one of the so‑called core platform services. So these are services that are deemed to be particularly important to the digital space.
So it could be any type of marketplace, search engines, video‑streaming platforms like YouTube, services that are basically messaging apps, virtual assistants, web browsers, operating systems, cloud computing, and online advertising.
And this is a first list that is going to be revised. One of the discussions going on right now is whether ChatGPT should be included, and whether it actually fits in one of the existing categories, such as that of search engines.
So these platforms are in critical areas, they are important in terms of size and potential impact, and they have to be in a durable position.
That means these criteria have been met over the last three financial years, so it's not just by chance or seasonality that they had a lot of users; these platforms have been used for years.
And what is the aim of this? Why this new regulation? Well, the main reason is that the existing competition law, which is aimed at sanctioning anticompetitive behaviour after the fact, is, for many different reasons, slow to catch certain conducts that are typical of these platforms.
So the idea is, before any abuse of power can take place in the economic sense of the word, to create new rules, new obligations for these platforms, so they cannot abuse their position of power.
So once the platforms are designated as gatekeepers ‑‑ and here you have the usual platforms that you have in mind.
Alphabet ‑‑ so that's the Google conglomerate. We're talking about Amazon, Apple, Meta, all the Facebook family products, and Microsoft.
So we're talking about the main platforms that have the most power on the Internet.
These gatekeepers that have already been designated, because they meet the criteria I was talking about before, have new obligations that they didn't have until two years ago. The obligations are different ways of trying to make the platforms not abuse their power.
So the first one is that they have to allow businesses to offer their products and services outside of the platform. There were many cases where, for example, an app by a small or big developer had the issue that it had to go through the App Store, which takes a big cut, usually around 30%, and could not promote, in any way, the option to pay outside the platform.
So the platform abuses the fact that it is precisely the gatekeeper between the people who have the phones and want apps, because the only way to reach the apps is through the store.
So it can extract all the value from the apps.
That ends up not benefitting consumers, because then apps are going to be of lower quality, there are going to be fewer apps, and there are fewer incentives to enter the market because it's too expensive.
A second set of provisions is about access to data. Business users, like the sellers on Amazon, usually don't have access to the data about the people they interact with, which they could use to make their services higher quality. The new obligation is that the gatekeepers have to give them that access so they can better compete.
A third obligation is to allow users to uninstall preinstalled apps.
On many platforms, the operating system of the phone comes with apps you cannot uninstall. Apple is an example: they put an app there, and you cannot take it out. Now they're obliged to let you take it out. So there's new competition: if you want to use DuckDuckGo instead, you can download it.
There's also an obligation not to combine personal data from different platforms without consent. For example, Google has a lot of different services with the same users, so they know where you go looking for food in Maps and what you look for with the search service. If you don't give consent, they cannot use that. Otherwise, the power to merge data from many markets means nobody can challenge them in any market, because it's very difficult to replicate the fact that a few gatekeepers have access to data from multiple services.
Think of this for business users: for people who use the Microsoft suite, who have the cloud and the operating system, Microsoft collects all the data, and it's very difficult for someone from a non‑gatekeeper platform to replicate it.
In the same spirit, another obligation is ensuring interoperability. It's a classic problem that a lot of complementors are facing.
Then, about the advertising market: the data and the pricing are concentrated, and the gatekeepers control the value chain of online advertising. So, again, it's very difficult to compete with that, which, in turn, leads to higher prices for advertising and eventually higher prices for users for anything advertised online.
Finally, there's allowing developers to use third‑party payment systems. In the same spirit of allowing them to do business outside the gatekeeper, users should be able to pay with something other than, for example, Google Pay or Apple Pay.
There's also an obligation not to push users towards the gatekeeper's own services. For example, when you used to look for something on Google, you might find that the first link goes to Google Maps. Now, in Europe, it doesn't anymore because of this. Or Amazon, which allegedly tries to push its own products so you end up buying the Amazon Basics and not the products of independent sellers.
So these things are being scrutinised by the Commission.
These conducts harm consumers through less competition.
Another example is not allowing users to switch apps, making it difficult from a technical point of view to change provider.
So, to sum up: these platforms are big, they have a lot of impact, and the specific markets they operate in are critical. The new rules are asymmetric: they apply only to the gatekeepers.
So that they cannot harm competitors and consumers.
And where are we with this now? It's young; it's only been two years. But in these two years, we have four cases open: three against Apple and one against Meta.
So, basically, against Apple, we have one case about anti‑steering ‑‑ or anti‑self‑preferencing, the idea that the platform may benefit its own products as the gatekeeper.
Apple has been hit with a fine of €500 million so far. This is all public information; you can check the decisions and the whole process.
Also, a case has been opened against Apple on the issue of non‑compliance with the choice screen. To give users other options, you should show a choice screen: when you open a link, do you want to use Apple's browser or other browsers?
And the Commission found non‑compliance in the way they're implementing this, because they may be trying to trick the user into using their browser through the design of the screen.
Another case against Apple, the third one, is about the specification decisions on connected apps. This is a more technical one, but it's basically about how Apple is limiting interoperability. The Commission is saying: you're not making this as interoperable as it should be to let people add products to your ecosystem.
Finally, the last case ongoing is the one against Meta.
It's basically challenging the consent‑or‑pay model: the idea that you cannot use the product unless you accept abusive terms and access to your personal data. The Commission is saying, well, Meta is still not offering a free equivalent. It's basically either you use my product for free and I abuse the data I collect, or you have to pay me.
So far, Meta has been fined €200 million. This is obviously under appeal, but, as you see, in two years we already have four cases open, and more will probably be opened or scrutinised in the future. Hopefully, if the enforcement is effective, we should be seeing digital markets in which the dominant platforms have less capacity to abuse their gatekeeping power, which should benefit consumers, and there will be less power to abuse consumers in economic ways.
>> RAQUEL DACRUZLIMA: Thank you for bringing this great perspective from the DMA.
From what I heard, you were quite clear that the objectives were related to the economic field, but the concepts are close to human rights, such as making sure that the platforms are not abusing their power.
Also, the part about not harming consumers.
Camila, after hearing about gatekeepers and what the DMA had in mind, do you think we can consider the major communication and access platforms as gatekeepers of human rights, and is there a link between competition and fundamental rights?
And, Camila, if you could introduce yourself, I would appreciate it.
>> CAMILA LEITE CONTRI: Thank you. It's a pleasure to be here in this panel with you. Short answer, yes, I would go for it.
But it is a pleasure to be here representing IDEC, which is based in Brazil and has more than 35 years of experience in protecting consumers through advocacy, campaigning, and strategic litigation, including against Big Techs.
I have a background in competition law as well, so that's my disclaimer.
I kind of felt isolated in both fields: on the competition side, talking about human rights in the digital sphere, and in civil society, on the human rights side, talking in the language of the market.
So my personal goal is to try to connect both fields, in order to answer this question and to have more people breaking this barrier and understanding that, yes, monopolies and competition issues are key to human rights, and we should analyse them together.
But the reality is that, for example, this is the only panel at IGF where we are talking about competition or antimonopoly. I don't say this as a criticism, but as a sign of how much more we need to discuss this.
And I think this is a consequence of a still pervasive narrative that in the market we should focus on innovation, and of a technocratic point of view that puts the economic, the market, on one side, and the social dimension on the other.
There's a competition professor who talks about how to reclaim the social dimension of competition law. Digital markets are key to understanding this concept, and to understanding who currently holds the power to determine these consequences.
Power is foundational to the issues we see; not all of them, but most of the problems.
We currently have a society that is tech‑mediated. Our citizenship is tech‑mediated. I can talk about Brazil and share experiences of how Brazilians deal with the Internet, especially the lower classes. IDEC has research on how the Internet is used. In Brazil, we still have zero‑rating practices: people who have data caps mostly use the platform applications that don't spend their mobile cap. So we currently have people who have, for example, 4 gigabytes. Think of how important this is for how debate develops and how people express themselves. We have an issue in the discourse itself.
The second thing is the way we use platforms; it goes beyond not being a choice. The platforms are profiting from ‑‑ sorry, I forgot the word in English ‑‑ from political dispute. It unfortunately affects freedom of expression, and, meanwhile, platforms are gaining, profiting, from that. That's concerning for me.
And the example of how economic power translates into other kinds of power and directly or indirectly affects human rights is how platforms interfere in the political dimension, in political discourse.
For that, I would like to bring a concrete example about how this occurred in Brazil.
In Brazil, we're currently discussing not a DMA as such, but the possibility of improving our ability to deal with digital markets. Although human rights are not embedded in that proposal, some examples of how the DMA could be interpreted as having good consequences for human rights could also be imported into Brazil and adapted to Brazil.
The prohibition of making people decide whether to give up their data or pay for the service is a good example.
The second thing: creating possibilities for users to choose the platforms they want to use could mean having platforms with moderation rules that are less restrictive of freedom of expression, and could also promote other rights.
Now I will enter the concrete example in Brazil. It's about the limitations on self‑preferencing.
Bruno mentioned that Google, when you search for a place, cannot, in Europe, point you directly to Google Maps, because this is a way of self‑preferencing another Google service.
In Brazil, we had an interesting case that was presented before the competition authority, and it was political. During the week the fake news bill was being discussed, Google's homepage carried a phrase saying the bill could increase the confusion between what is true and what is a lie in Brazil, and this phrase directed to a blog post saying the bill could change the Internet for the worse. And when you searched for the fake news bill, the first link that would appear would be a sponsored link promoted by Google saying: know the censorship bill.
How can we have a free space for debate when this is practically the only search engine used? Is this a fair way to interact on platforms?
So, moving on to what we can do about that.
I believe we have a common understanding that this power is exercised in different ways, that economic power can be translated into political power, and that this has consequences for human rights.
What can we do as civil society? Empower ourselves to speak this market language. We should not have a prejudice against market language.
We should be empowered to have these discussions.
I think both IDEC and Article 19 are happy to engage with other civil society organisations with openness.
From the competition side, from the authorities' side, they have to empower themselves to understand this translation: that economic power is not confined to the market. It has consequences for other rights.
In Brazil, we have a single constitution that protects rights.
So although there's competition law, it should also respect the other provisions of that constitution, including human rights.
Lastly, I think we can be more concrete, and one concrete proposal ‑‑ and I praise the work of Article 19 here ‑‑ involves the services that Article 19 has related to claiming the tax.
Happy to continue talking. Thank you so much.
>> RAQUEL DACRUZLIMA: Thank you, Camila. This spoke to my heart, because I'm a human rights lawyer; that's my only background. For me, everything here is new, discussing things like antitrust.
What is really powerful about being at IGF is bringing people together from different sectors, the opportunity to talk to civil society and states. This is exactly what we need. And I don't think we need to go back and forget our backgrounds, but exactly to put them together and make them more powerful.
I think something you said was quite important: the idea that businesses and authorities are all bound by the constitution. In the human rights field, especially in the Inter‑American system, we have long discussed the duty of conventionality control by every organ of the state.
So whatever their conduct is, they have the duty to take into consideration the international treaties ratified by their states.
So why are human rights not taken into consideration when competition is discussed, and when other actors are acting in this field, especially when, as was said, tech is mediating everything? We, from human rights backgrounds, also need to learn more about business, competition law, and so on.
With that, I would like to turn to you, Hannah, because, as Camila said, there are decisions in the private sector that have an impact on our rights. I think you have a great experience to share with us on what we can expect in an environment with more competition, as Bruno brought up, on what kinds of business opportunities are there to emerge, and on how those opportunities may take us to businesses that are more aligned with human rights goals and standards. Please introduce yourself.
>> HANNAH: We want AI that is ethical, controllable, and accessible. We are specialising in the entertainment sector, working with Sky, Global, TV5, et cetera, and I have a background in consultancy. As you were saying, indeed, private companies ‑‑ whether solution providers like us, traditional media outlets, or even social media platforms ‑‑ while pursuing, of course, profit and protecting their own interests, also have a responsibility not only to respect the law but also to set the standards for transparent AI, especially in media and entertainment.
Their influence goes beyond their operations. Through their technology, they help shape how many people engage with culture and information every day. So the way those companies, including us, do business affects our rights as civil society: access to information, free speech, media freedom, and privacy. That's why we are discussing human rights today.
Information today is distributed without ‑‑ as we know, it circulates through social media.
The presentation of this content is governed by algorithms that remain opaque to users, collecting private data with little or no regard for actual transparency or contextual understanding.
This lack of contextual framing contributes to the spread of misinformation.
When there's a very well‑defined context, like reading newspapers such as "The New York Times" or watching the BBC, there are assumptions about style, tone, and political orientation, which give us a context we know.
But today, for many young users, information comes through feeds; the media becomes more anonymous. The user is exposed but not oriented by any editorial line.
Here there's an effect on public life and discourse. There are pressures on profitability that undermine the position of many traditional media outlets and their capacity to provide quality information in the landscape we are navigating today.
There are financial difficulties being faced, because advertisers' budgets are going towards individuals ‑‑ by individuals, I mean influencers ‑‑ and the creator economy has become a force.
There are blurred lines, and I think this shift raises concerns about access to information and the role social media have today, despite, of course, the introduction of rules and regulations designed to address these issues.
So this is a question we might raise today: should business models be aligned with human rights? There are difficulties, but there are also things worth preserving. Even if misinformation is not fully mitigated, new voices emerge and the space is less dominated. There are creators who put in a lot of work, with lower entry barriers, but there's a distinction of influence: there's a difference between them and professionals who are trained to verify information.
GenAI will add non‑verified content to the massive ocean of content we have today.
So, to summarise: facing this flood of content, it becomes necessary to imagine new tools designed to restore clarity and control to users.
First, for the traditional media outlets that find themselves in opposition to social media, what I would suggest is that we should push for platform and algorithm sovereignty. When it comes to access to information, one strategy is to support or develop independent platforms that blend algorithmic curation and editorial supervision.
Basically, what we've been noticing is that when you explain to end users why the content they are watching or reading is shown to them, when it has an explanation behind it, they trust it more.
The relationship is more open.
For example, it could be indicated that a certain piece of content is relevant because it addresses themes present in previously viewed content.
GenAI now makes it even easier to extract those themes and produce descriptive tags at scale, and this supports informed navigation.
Even a platform owned by Google, YouTube, has, since 2023, started experimenting with making content labels more intelligible.
There's also respect for privacy regulation and the GDPR: not using demographic data like age and location when building a personalisation strategy, while preserving a good UX. I think that's indispensable in competing with Big Tech.
I don't have much time, so I will conclude. There's another important aspect: rethinking the way media organisations work today.
Those are the kinds of media outlets that can go forward. For us, there are more and more opportunities opening up for pluralism, but in order for new business models to emerge, our technology should not serve as a tool in the race for visibility. The necessary tools exist. The true challenge is conceptual: it's always choosing what to prioritise and understanding the noise and the overreach.
>> RAQUEL DACRUZLIMA: Perfect. That's so great and powerful. Choosing to prioritise is an idea to keep in mind.
Also, something else you mentioned, Hannah, I think is important not only for digital markets but for the media in general. I heard a lot yesterday and the day before about trust. I think that when the logic of what users see is explained to them, at least in Brazil, that builds trust. That also applies to traditional media, because, often, the positions of traditional news outlets are not clear. They also don't make transparent choices.
I think transparency is always the foundation of building trust and enabling freedom of expression and access to information.
Right now, we should have had our fourth panellist, but he couldn't join us. So I will open the floor for any questions or interventions you would like to make. We have around 12 minutes, so it's actually quite a good time to hear from you online and also here. You can talk from the mic or come to the roundtable.
Please introduce yourselves when you're making your question.
>> Laura: I'm Laura. I'm from the youth programme. I'm from Brazil too. I loved what we discussed here; your panel was amazing. But I wanted to know: in a competition scenario, how can the Global South increase its protagonism when we don't have the means of our own? You have a monopolisation by Google and Meta. Can we start our own social media, our own platforms? Google is the main one used. How can we do this?
>> I'm from the youth organisation.
Speaking about the DMA in the European Union, we see changes in iOS in general. When we look at the European Union, we see that alternatives are being created.
But how do we overcome obstacles regarding incentives for users? Because although alternatives might be available, how can the incentives to use Big Tech services be overcome in a context where it's sometimes easier to use Big Tech services or platforms? The institutional arrangement has been changed, but how can people feel incentivised not to use Big Tech's platforms and services?
Thank you.
>> RAQUEL DACRUZLIMA: Thank you. You can go.
>> Jacques: My name is Jacques ‑‑. I'm from the business side, but I'm also teaching regulation at a Dutch university. My question is primarily for the first speaker, who deliberated very well about the Digital Markets Act from the EU. What we see in Europe, of course, are fragmented local markets. The DMA addresses European‑wide big platforms. But what about local champions? And then the question of how Brazil is handling local champions: are there really national platforms?
>> RAQUEL DACRUZLIMA: Thank you.
>> FLOOR: Hi. My name is Beatriz, also from Brazil. I'm currently an assistant professor of law in the UK. One of the things I teach is Internet law and regulation, platform regulation. I learned a lot from the panel. I think it was very good to hear more of a civil society perspective as well, and the need to empower organisations to join the conversation: human rights organisations, people involved in governance more broadly and in the ‑‑ aspects of regulation.
I'm curious to hear from the panel, and from Bruno; there's also scholarship on this. What about the perspective of regulators joining up and engaging in a more holistic conversation about how to regulate the platforms, not only from this market or economic perspective, but also about how human rights could more broadly inform that?
I would say I think there are some lessons to be learned from how competition authorities engage with data protection and the GDPR, and, at least in the relevant European case law, from how considerations of data protection could inform and delimit the barrier between what is acceptable and what becomes abusive behaviour.
Do you see trust and safety being affected? Transparency? For example, the DSA being used to help draw the line in terms of an abuse of dominance in digital markets? That's one part of the question.
More broadly, this helps to counter narratives that present the two as in conflict. When digital markets regulation was being proposed in the U.S., there was debate among several academics that breaking up the public sphere and introducing more players would make it harder to control hate speech, or platform regulation more broadly.
So there was this dispute about how to hold platforms accountable. It's not an easy one to tackle. But I would say it's important for regulators to have the perspective of how things are joined together.
So, yeah, I'm curious to hear from you. How do you see that?
>> RAQUEL DACRUZLIMA: Great. I don't think we have other questions. So I would just add to that before I give the floor back to our panellists.
To the first question I will add, for all of you: do you see any priorities, in terms of regulation now, to increase participation and make the market more respectful of human rights?
And the second question, I think, was directed to Hannah and Bruno. It was about advertising. I would like to know, from a European perspective, if you see any changes already. We also have concentration in the advertising market. Do you see any changes in breaking into the market of advertising and making it more aligned with human rights?
I think, starting with Bruno, we have around seven minutes for each of you to answer the questions and also make your closing remarks.
So Bruno, you can start, please.
>> BRUNO CARBALLA SMICHOWSKI: Thank you very much. Well, lots of good questions. I'm going to try to squeeze the answers into a short time. Again, this will be my own personal opinion, not an official Commission one.
With the first question, about alternatives: my personal view is that there's no magic, one‑size‑fits‑all solution to this, especially for countries like Brazil. I am, myself, Argentinian as well, so I understand where you're coming from. I think different ingredients can be added to build alternatives.
One is, for certain more infrastructural parts of the digital world, public alternatives that can counterbalance market power in a very strong way. But, of course, this has to come with what makes them a real alternative. When I asked why one such alternative was a success, the answer was: basically because all companies were forced to be interoperable with it, and the solution had to be good and practical. I think the public sector can replicate that: it came up with a good solution, and people can use whatever service they want, but the fact that this exists gives much less power to any platform that could be the gatekeeper of digital payments.
That's one solution I can see, in some areas; I don't know about all.
Another is certain public infrastructure layers, in terms of cloud and the digital chain.
Then, I would say, for the government itself, for critical things, I think open‑source alternatives are to be promoted. For instance, to me, it should be clear that government offices should be using open source by default. This could be translated into public procurement requirements.
And for those things where, from an economic point of view, it doesn't make much sense for them to be publicly owned, good regulation is the tool. I think that's what we're all experimenting with. Brazil is making nice advances in regulation, and not only economic regulation.
This is going on all over the world. Perhaps the European Union was the first, but in the UK legislation has already been put in place, Australia is discussing the same, and many other countries are following. I think there will be a nice library of what works and what doesn't. That way, perhaps being second movers is an advantage for countries like Brazil, because they can learn from the mistakes we will surely make in the European Union.
But local champions, yes, I think you are right. The DMA doesn't explicitly target them ‑‑ I mean, not by design, in contrast to the Digital Services Act ‑‑ but given the thresholds, the platforms that end up being designated under it are usually Europe‑wide or even international ones.
That doesn't mean there couldn't be any fostering of champions. There's a discussion going on ‑‑ some may have seen it ‑‑ in the European Union about digital industrial policy.
For example, on AI there's a battery of new legislation, and strategic plans are already in place about how to foster those champions along the AI chain.
So I see those two as complementary ‑‑ again, this is more of a practical point of view. It takes a lot of time and effort to regulate, so you have to aim for the players with the highest impact, and those are very international.
On this question on regulation, Beatriz ‑‑ nice to see you again ‑‑ the question about the dialogue between types of regulation: it's happening on the inside already. The DSA deals with systemic risks like disinformation, and some platforms are regulated by both. There's a dialogue in two ways. One is in terms of procedures ‑‑ the cases are similar, for example, and colleagues are helping each other within DG CONNECT. I think there's a lot to learn from the longer experience of competition law, and vice versa ‑‑ it goes both ways. I foresee, and already see from colleagues' work, dialogue in terms of methods as well. Take self‑preferencing as an example: the way you can monitor it from a competition point of view could be done in‑house, but the colleagues doing the DSA are developing ways of monitoring harm to users.
So I see those different types of regulation and procedures.
And then, at the more political level, coming back to the first question, I think there's everything to gain from dialogue between different jurisdictions and from learning from the different designs.
Obviously, in the European Union they ended up saying, okay, the DSA should be one instrument and the DMA another.
They overlap in the type of platforms they're going to regulate, but that's an institutional design choice. If some other jurisdiction puts them under the same umbrella, it wouldn't be bad.
There's information we can learn from what worked and what didn't and from previous successes.
I think it's still too soon to tell because things are ongoing. These are highly technical matters that require time just like competition law.
In my personal opinion, it came too late. The whole advertising chain was laid out in a nice report by the CMA when they did a market study. It's highly concentrated, and that's a problem.
At this stage, what we can expect is to do good regulation. Where the harms are non‑economic, I think that is where the DSA, the Digital Services Act, should come in.
If eating disorders are promoted to minors, for example, that's a harm driven by the economics of engagement.
>> RAQUEL DACRUZLIMA: Just a small footnote for everyone who is not from Brazil.
Bruno mentioned I'm ‑‑
Camila, you have the floor. You have seven minutes.
>> CAMILA LEITE CONTRI: Uniting the questions ‑‑ happy to see you, people ‑‑ I think you brought a good example of how network effects work in practice. We're on these platforms because our friends use these platforms, and we're there because there's content created for us. How can we let go of these if everyone ‑‑ sorry, I'm hearing myself ‑‑ if everyone is there? So it seems like a chicken‑or‑egg problem.
How can we move to alternatives if everyone is on the other platforms? So, yeah, this is challenging, but having alternatives can at least make people think more about the possibilities they could have. Otherwise, we just continue the situation in which we are enclosed in this kind of platform. But there's digital literacy work as well that we have to do.
On the questions related to alternatives, I think there are some things we can do right now and some alternatives that we can still promote in a longer term.
The first thing that competition authorities can do is work with broader theories of harm ‑‑ that is, how a competition problem is interpreted and how a company is sanctioned once a competition issue has been found.
This can happen even under the current competition laws, because they are broad enough to be interpreted in a way in which the consequences considered are not only market‑related but also touch other rights.
The other point is that we are often too late when we interfere in some markets. Maybe this is an incentive to use other measures.
So, measures that authorities can use before the procedure ends.
So these things, we normally suggest when talking about ‑‑ digital markets, and this can also relate to human rights.
In the longer term, to bring in the human rights lens, the good path is collaboration and dialogue between authorities. We have one good example in the EU of a case at the intersection of data protection and competition law. I personally studied this as part of my Ph.D. ‑‑ not from a human rights perspective, but looking at the cooperation between data protection, consumer, and competition authorities. I think we can apply this to human rights authorities as well.
In this proposal that we have, there's one suggestion of a permanent institutional dialogue, especially between the authorities that are mentioned ‑‑ mostly market‑focused, more traditional authorities, such as telecommunications. But maybe one good concrete suggestion we can make is having civil society participation, which, for me, seems essential, alongside the authorities that bring the economic lens to this forum.
And also talking about the Brazilian proposal on how to improve digital markets enforcement, one of the goals of this provision ‑‑ and I'm talking specifically on the ‑‑ one of the goals is promotion of a Brazilian competitive market. So, yes, we are trying to look also on this side.
This may be a good incentive for the private sector to see the importance of this agenda. Sometimes they see regulation as bad, but it's about the promotion of vibrant markets, and that can benefit human rights.
But the other solution we can have is being a little more intense and radical and breaking up companies, as we see that they have an immeasurable impact on our lives.
Maybe the solution is that they didn't have to be that big.
That's why I praised the solution presented by Article 19 on unbundling hosting and curation services, for example.
Another concrete example ‑‑ and I will end here ‑‑ is the judgment I mentioned on data protection and competition law in the EU. The Facebook judgment concerned a decision made by the German ‑‑ authority, and it went to the Court of Justice.
It was about whether the German authority could treat a data protection violation as a competition breach, and the Court gave good examples of how something could be considered a breach under another law.
The result was that, where the data protection authority had already decided, the competition authority could not depart from that decision, but it could still reach its own conclusions.
They could consult the data protection authorities and seek cooperation, and if no objection was raised, they could continue with their own case. So, yes, we're already thinking about data protection and competition together ‑‑ why can't we understand human rights impacts and bring them in too? But this takes bold public servants. I'm happy to be on this panel and to see your willingness to have these discussions, and I hope other authorities, such as the Brazilian ones, have this same openness ‑‑ and I believe they will.
>> RAQUEL DACRUZLIMA: Thank you, Camila. You're so precise with your time.
You can make your closing remarks, please.
>> To jump on what Camila was saying, I think that, technically speaking ‑‑ just from an algorithm and personalisation standpoint, doing personalisation as it's done on social media, not in terms of advertising but really in terms of user experience, meaning having a personalised feed on whatever social media we're all using ‑‑ it's absolutely possible to do it while respecting the GDPR and still having a good user experience.
Big Tech and social media platforms have internalised the idea that you need to use sensitive data ‑‑ meaning gender, age, whatever, any demographic ‑‑ for a good user experience, and that is not true. Advertising relies on that data, but the user experience does not.
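Hannah's point ‑‑ that a personalised feed can run on behavioural signals alone ‑‑ can be illustrated with a minimal sketch of item‑based collaborative filtering, which ranks content purely from interaction histories. This is an illustrative toy, not any platform's actual method; all names and data are hypothetical:

```python
# Toy item-based collaborative filtering: recommends items to a user
# using ONLY interaction histories -- no gender, age, or other demographics.
from collections import defaultdict
from itertools import combinations

def build_cooccurrence(histories):
    """Count how often each pair of items co-occurs across users' histories."""
    co = defaultdict(int)
    for items in histories.values():
        for a, b in combinations(sorted(set(items)), 2):
            co[(a, b)] += 1
    return co

def recommend(user, histories, co, k=3):
    """Score unseen items by co-occurrence with the user's own items."""
    seen = set(histories[user])
    scores = defaultdict(int)
    for (a, b), n in co.items():
        if a in seen and b not in seen:
            scores[b] += n
        elif b in seen and a not in seen:
            scores[a] += n
    ranked = sorted(scores.items(), key=lambda x: (-x[1], x[0]))
    return [item for item, _ in ranked][:k]

histories = {
    "u1": ["news", "football", "cooking"],
    "u2": ["news", "football", "politics"],
    "u3": ["cooking", "gardening"],
}
co = build_cooccurrence(histories)
print(recommend("u2", histories, co))  # prints ['cooking']
```

Because the only input is what each user interacted with, no demographic or otherwise sensitive attributes are needed for the recommendation itself.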
From a private‑sector point of view, I think that message could be a lot stronger, and it would probably not harm some parts of the business. And I don't think relying as heavily on advertisement as platforms do today is sustainable ‑‑ it brings problems, because, let's be realistic, the reason the services are free is that they rely on advertising rather than subscriptions. So the question is what we are looking for. Are we looking for information? Are we looking to interact with our friends? Maybe a platform that tries to combine the best experience of everything in one is probably not viable in the future.
So, for instance, when talking about Brazil not having its own infrastructure, I think there are also layers in between choosing U.S. providers like Google ‑‑ for big media companies, I would say a local champion is one that decides itself how to push information. They could rely on the Google algorithm, rely on a proprietary algorithm brought by vendors like us, or develop one themselves.
For that, you need intervention in favour of small tech vendors.
I think there's room to incentivise private companies to do more. It's very complicated to create innovative, open‑source solutions that scale ‑‑ for smaller vendors, and for vendors that care about ethics, I would say.
To do that, I think there should be an incentive, in terms of regulation or in terms of intervention ‑‑ sometimes I don't have the answer; I'm not a regulator myself ‑‑ because it's actually a matter of will, and just encouraging it is not enough. And this, I think, is what could be interesting for innovation at scale.
In terms of advertising, of course the market consolidates. I think we're still watching the decline of cookies and still looking for a new way of doing advertising ‑‑ meaning adding a proper way, and technology will help with that, to explain why an ad is pushed to a user.
Today, that's not enough. As long as the model relies on advertising, it will be complicated to fight against that kind of lobbying from the advertisers. I think we still have a lot to do. Hopefully, yes, I am pro ‑‑
>> RAQUEL DACRUZLIMA: Thank you. I will give the floor back to Camila. You each have one minute to make a closing remark.
>> CAMILA LEITE CONTRI: One thing I wanted to mention is related to the cloud. In Brazil, we're still very dependent on Big Tech clouds, and this is a matter of sovereignty. We had a case in which a person working in the Brazilian government went to a cloud company and then came back to the government, which can raise some concerns as well.
Another point was how we can create alternatives. There is discussion in Brazil about funding some alternatives ‑‑ digital public infrastructures, for example ‑‑ and how we can build them, from small companies but also from the public sector, beyond regulation, of course.
It was a pleasure to be here. I'm excited to continue these discussions. Thank you so much.
>> RAQUEL DACRUZLIMA: Bruno, would you like to say some final words?
>> BRUNO CARBALLA SMICHOWSKI: I would like to echo the words about the interest in dialogue across both disciplines and jurisdictions, and I'm happy to continue it.
Thank you for the invitation.
>> RAQUEL DACRUZLIMA: Hannah?
>> Hannah: Focusing a little bit more on the algorithm analysis: the technical solutions are here. It's a matter of information and of will for Big Tech to do it. If you're continuing on that, you have to ‑‑ I don't know.
>> RAQUEL DACRUZLIMA: Perfect.
I would like to invite you to read "Taming Big Tech." We have a Portuguese version.
We talked about how unbundling ‑‑ it would help users to leave the big platforms, and there could be other business models working with curation, offering other kinds of standards for how we interact with our friends and the content we see, with more transparency.
You can check it out on our website.
Thank you, all, so much. I think we can be a bit more radical and bold and maybe tackle Big Tech. Thank you very much for joining us today.
(Applause)