Entries submitted
B1 – Entry by: Autoridade Nacional de Proteção de Dados – ANPD
Description of the initiative:
The initiative involved the creation and public availability of interactive dashboards on the ANPD website, developed by the Security Incident Treatment Coordination of the General Coordination for Oversight (TIS/CGF/ANPD).
The primary objective is to provide information on security incidents reported to the Authority, while also fostering active transparency by enabling the public to monitor and understand the Coordination’s work in analyzing and handling such incidents.
These dashboards are updated in real time and offer stakeholders a clear and categorized view of key information (an illustrative sketch of such a record follows this list), such as:
- Location where the security incident occurred (federative unit) – Estado.
- Public or private sector involved – Setor.
- Market segment – Segmento.
- Type of incident – Tipo de incidente.
- Type of communication (preliminary, supplementary or complete) – Tipo de comunicação.
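The entry does not describe the underlying data model, so the following minimal Python sketch is purely illustrative of how one categorized incident record might be structured; the field names simply mirror the dashboard categories above, and the sample values are invented.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class IncidentReport:
    """Illustrative record mirroring the dashboard categories; names and types are assumptions."""
    estado: str                      # federative unit where the incident occurred
    setor: Literal["public", "private"]
    segmento: str                    # market segment
    tipo_de_incidente: str           # type of incident
    tipo_de_comunicacao: Literal["preliminary", "supplementary", "complete"]

# A dashboard could aggregate and filter hypothetical records such as this one.
example = IncidentReport(
    estado="SP",
    setor="private",
    segmento="finance",
    tipo_de_incidente="ransomware",
    tipo_de_comunicacao="complete",
)
print(example)
```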
In addition, a dedicated dashboard covers the Incident Investigation Procedures (PAI): investigations initiated ex officio or triggered by complaints or media reports, in which ANPD examines whether a security incident occurred. The data covers the period from 2021 to the current year, providing both a historical and an up-to-date overview.
The page hosting the dashboards was completely redesigned to enhance user experience, adopting a more user-friendly and functional interface for the public.
With this initiative, the Security Incident Treatment Coordination (TIS) strengthens communication with society, expands access to information and prioritizes best practices in transparency and accountability in public administration.
Most importantly, it advances the continuous pursuit of improvement and innovation in public services, always aiming at the effective implementation of public policies within the institution.
The central idea behind the development of the interactive dashboards emerged as a response to the high volume of information requests regarding security incidents received by ANPD — whether through the Ombudsman via Fala.br, or via emails sent directly to the Coordination.
Why does the initiative deserve to be recognised by an award?
This initiative represents a significant advancement in the sharing of information related to security incidents, strengthening active transparency and accountability to society, especially considering the sensitive nature and public interest surrounding the topic. By providing up-to-date data directly to interested parties, ANPD reinforces public confidence in its work, offering valuable insights for citizens, researchers, the press, and organizations concerned with the subject.
Interactive dashboards enhance data navigation and understanding, making information accessible to all audiences. They empower citizens to verify whether personal data breaches have occurred in their respective Federative Units (UF/Estado), enabling them to assess potential risks involving organizations with which they have relationships.
Such efforts also reflect institutional maturity by documenting and publicly communicating ANPD’s ongoing work in the investigation and handling of incidents.
In addition, the user-centered redesign of the webpage, combined with the continuous maintenance of up-to-date information, positions the initiative as a potential model of best practices – one that can be replicated by other data protection enforcers in Brazil and internationally.
B2 – Entry by: Autoridade Nacional de Proteção de Dados – ANPD
Description of the initiative:
Between June 2023 and March 2024, ANPD and the Executive Consulting Unit “Simplifica” of the Ministry of Management and Innovation in Public Services worked on mapping and redesigning the “Request Handling” process, under the responsibility of the Monitoring Division within the General Coordination for Supervision.
This initiative took place within the scope of another program that offered mentoring for process simplification projects across the Federal Government.
To that end, the following methodology was applied:
(1) Process prioritization;
(2) Schedule (WBS);
(3) Diagnosis;
(4) Process modeling;
(5) Problem and Solution Matrix;
(6) Process redesign;
(7) Standardization;
(8) Creation of artifacts;
(9) Automation;
(10) Final report.
The pilot project was implemented in February 2024.
The “Request Handling” process deals with submissions by data subjects reporting violations of their rights or breaches of the LGPD.
One of the main challenges faced by data subjects was the need to register in ANPD’s electronic system in order to file a request. This process was slow and involved:
(1) filling out an online form,
(2) emailing a signed statement and copies of identification documents,
(3) verification of the information by the protocol team, and
(4) creation of a password.
If everything went smoothly, it could take up to three days—time the individual had to wait before being able to officially submit a request. In short, it was a procedure that discouraged people from submitting their claims. At that time, ANPD received an average of 120 requests per month.
The old system used for receiving requests was not integrated with other platforms and was not intuitive or widely understood by citizens.
The new service, along with the solutions implemented during the process redesign, has positively contributed to the efficiency and transparency of internal workflows, enabling clearer communication, operational execution, and decision-making.
Following the implementation of the service, ANPD began receiving an average of 750 requests per month—a 525% increase.
The most significant changes involved simplifying the registration process and improving the request form.
The service now accepts the federal government’s unified login for digital public services, a system with over 150 million registered users.
Why does the initiative deserve to be recognised by an award?
In the words of Minister Hélio Beltrão (Decentralization and Freedom, 2002), “Landing in the real Brazil involves […] simple and inexpensive solutions, tailored to our realities […] and, above all, to the low standard of living of the majority of our people.”
The collaborative network-based approach has enabled the rational use of public resources, preventing rework and waste.
The service not only improved the institution’s internal processes but also added value to the Public Administration and, most importantly, to society—resulting in a more efficient, transparent, and responsive service. It reflects the commitment of the Brazilian public sector to building a fairer and more inclusive society.
By integrating with the federal government’s unified login system (with over 150 million registered users), the new service has made it easier for data subjects to exercise their rights and communicate with the ANPD. This progress is evident in the number of requests received. Following the implementation of the service, ANPD went from an average of 120 requests per month to 750—an increase of 525%.
Alongside the new service, ANPD also redesigned its webpage, providing step-by-step instructions in plain language on how to submit complaints or petitions (see link 1 in item e).
B3 – Entry by: Autorité de Protection des Données Personnelles (APDP – Monaco)
Description of the initiative:
Named Céos, “he who thinks”, after the Titan of intelligence in Greek mythology, the APDP’s virtual assistant can answer users’ questions about personal data protection and security in the Principality of Monaco in many different languages.
Developed in a dedicated, secure, isolated environment hosted in France, Céos does not collect any personal data: users’ IP addresses are not collected, and conversations are encrypted and anonymized. Conversations are kept for a maximum of seven days to improve the operation of Céos, and only administrators have access to them.
The Céos operating database is made up mainly of information available on the APDP website.
Deployment is taking place in two stages.
In the first phase (the current one), it provides users with quick, detailed answers on the recently adopted Law no. 1.565 of December 3, 2024 and on data protection in the Principality. In particular, it informs data controllers of their obligations and individuals of their rights. It also tells them where to find the various documents available on the new APDP website (practical information sheets, model letters for lodging complaints, model registers of activities and processing, etc.).
In a second phase (end 2025), it will support data controllers in achieving compliance. By means of simple questions to answer and direct links to relevant information documents (definitions, examples, practical information sheets, etc.), it will guide them in filling in the register of processing activities and the register of data breaches, as well as in carrying out their impact analysis.
Why does the initiative deserve to be recognised by an award?
Law no. 1.565 of December 3, 2024 governing data protection in the Principality of Monaco is very recent, and people – data controllers and individuals alike – have a lot of questions. In early 2025, the APDP launched a new website, which it regularly updates with documents (fact sheets, guides, etc.) and tools (forms, register templates, etc.).
To make this information more accessible and easier to use, the APDP decided to equip itself with an AI virtual assistant that enables users to quickly find available information and all the help they need to fill in the various documents, all with strict respect for their privacy, since no personal data is collected. Its operating database is made up mainly of information available on the APDP website.
The aim is not to replace the legal or technical advice provided by APDP agents, but to use AI technology to help users navigate the site and make the most of the tools at their disposal.
B4 – Entry by: Croatian Personal Data Protection Agency
Description of the initiative:
Although the General Data Protection Regulation (GDPR) has been in force since May 2018, achieving full compliance remains a significant challenge, particularly for small and medium-sized enterprises (SMEs). To address these challenges, the Croatian Personal Data Protection Agency, in cooperation with its partners, developed Olivia: an innovative, open-source, user-friendly, and interoperable digital tool specifically designed to support SMEs throughout their GDPR compliance journey.
Olivia offers a comprehensive package of educational and practical resources. It includes fifteen data protection courses that address all key obligations of data controllers and processors as defined by the GDPR. Each course consists of both theoretical and practical components. In the theoretical part, users can explore lessons explaining specific GDPR obligations, view educational videos, and take quizzes to assess their knowledge. The practical modules provide data controllers with templates and tools to generate internal documentation that demonstrates compliance and accountability. Additionally, the Olivia platform hosts twenty webinars covering a range of data protection topics. These webinars are permanently accessible and free of charge to all interested stakeholders.
Olivia is both a virtual teacher and an assistant. It contains a small online academy that offers SMEs, and indeed all data controllers, a series of learning modules to improve their knowledge of personal data protection, and it also serves as a practical tool to help organisations create internal documents to prove their compliance and accountability. It was successfully launched in 2024 and will be regularly updated to ensure its continued relevance and effectiveness. The Croatian DPA is now working on the development of modules on the interplay between the GDPR and Artificial Intelligence.
To further support users, a detailed user manual, handbook, and an instructional video have been developed and uploaded to the Olivia platform to serve as a lasting educational resource. The “Olivia” digital tool has empowered SMEs, as well as data protection officers across the EU, to improve GDPR compliance through user-friendly support, educational resources, and international collaboration. It enhances SMEs’ expertise, encourages a culture of privacy, and promotes EU-wide engagement through its open-source, multilingual design. Olivia is adaptable and scalable, enabling the seamless integration of new modules and language versions to support GDPR compliance across diverse national contexts.
Why does the initiative deserve to be recognised by an award?
Olivia deserves recognition because it represents a pioneering, practical, and sustainable response to a genuine need among SMEs for GDPR compliance support. Despite being in force since 2018, the GDPR remains challenging, especially for smaller businesses with limited resources. Olivia bridges this gap through an open-source, interoperable, user-friendly digital tool that combines high-quality educational resources with practical compliance support, empowering SMEs to meet their legal obligations confidently and effectively.
The initiative goes beyond traditional training by offering fifteen structured data protection courses, practical templates to generate internal compliance documents, twenty permanently accessible webinars, and educational videos, all freely available in English. This innovative approach fosters a culture of privacy, strengthens the data protection ecosystem, and supports the consistent application of GDPR principles across various national contexts.
Moreover, Olivia promotes international cooperation and future-proofs its impact by enabling seamless integration of new modules. By combining education, practical tools, and international collaboration, Olivia sets a unique and replicable standard for raising awareness and improving compliance across the EU and beyond. This makes Olivia truly worthy of recognition as an outstanding and innovative data protection initiative.
B5 – Entry by: European Data Protection Supervisor
Description of the initiative:
In response to the rapid pace of development of artificial intelligence, and increasing risks to fundamental rights on large online platforms, countries around the world are passing laws that intersect with privacy and data protection frameworks. Some of these laws provide the various competent authorities with new tools to promote a sustainable and rights-oriented digital economy. However, they also lead to parallel investigations by various authorities into the same practices of the same entities, with a potential for regulatory conflicts and inconsistencies in relation to data-related practices. Therefore, the EDPS observes a need for greater cross-regulatory cooperation to avoid an inconsistent application of legal requirements in this complex landscape.
To this end, the EDPS has identified key areas to work on, based on current initiatives rolled out in the EU and beyond and the feedback received from various stakeholders. This encompasses the need for a coherent and consistent application of EU law in the digital economy, in particular of the so-called ‘EU Digital Rulebook’ (including the Digital Services Act, the Digital Markets Act, the Data Act and the Artificial Intelligence Act); the need for cross-regulatory cooperation between competent regulators; and the need to uphold data protection as the backbone of this digital regulatory framework.
Building on an earlier experience that ran from 2017 to 2021, the EDPS proposes the establishment of a Digital Clearinghouse ‘2.0’ that would provide authorities and bodies with a forum to exchange and coordinate on issues of common interest. This forum should facilitate proactive, collaborative efforts among participating authorities to address potential issues before they become practical problems, ensuring that different authorities are aligned on goals, methods, and responsibilities to avoid duplication of efforts or inconsistencies in their actions.
A Digital Clearinghouse 2.0 should promote cooperation in ‘variable geometry’, providing relevant authorities, bodies and networks with the flexibility to join only those discussions and working groups where they have or need relevant expertise. This Clearinghouse should have a permanent Secretariat to assist in the timely delivery of concrete outcomes, such as joint statements and guidelines that draw on each participant’s expertise. The Digital Clearinghouse 2.0 should also become a forum where participating authorities lawfully share information about their ongoing enforcement actions.
Why does the initiative deserve to be recognised by an award?
The EDPS’s initiative acknowledges the proliferation of legal requirements that companies operating in the digital economy need to comply with – data protection being key among them – and proposes a pragmatic solution for the various competent regulators to align and increase legal certainty.
The Digital Clearinghouse 2.0 would be a forum to promote cross-regulatory cooperation at EU level, building upon initiatives for cross-regulatory cooperation that are operating in different regions (Australia, Canada, the UK, Ireland, the Netherlands, France, and Germany). This initiative is aligned with the strategic objectives of the GPA to:
- Map cases of intersection between personal data protection, competition, consumer protection, and other intersecting regulatory spheres;
- Identify barriers to cross-regulatory cooperation and develop or advocate for solutions where they do not exist;
- Encourage and facilitate greater bilateral or multilateral cross-regulatory cooperation between DPAs and other regulatory authorities.
This proposal of the EDPS feeds into the current discussion between the European Commission, the European Parliament and EU Member States on how to ensure simplification and competitiveness for businesses. One of the ways to pursue such goals is through enhanced dialogue, cooperation, and coordination among regulatory bodies to ensure a predictable and effective legal environment that places fundamental rights at the core.
B6 – Entry by: Garante per la Protezione dei dati personali (GPDP)
Description of the initiative:
In 2020, the GPDP launched a pilot program to prevent the dissemination of intimate content on social media platforms. This initiative marked the first institutional attempt to use technology in support of victims of non-consensual pornography, focusing on the early identification and blocking of sexually explicit content before it could be shared online.
To this end, the GPDP also established a dedicated internal taskforce to handle cases involving the non-consensual disclosure of intimate images and developed a fast-track emergency procedure to prevent their dissemination.
With the adoption of Law No. 205/2021 on “revenge porn,” this procedure was formally recognized and incorporated into national legislation, giving the GPDP an explicit legal mandate to act in this area. The law amended the Italian Privacy Code (Legislative Decree No. 196/2003) by introducing Article 144-bis, thus consolidating the Authority’s role in protecting individuals from the unlawful sharing of sexually explicit images and videos without their consent.
The initiative adopts a preventive and victim-centered approach. Individuals can submit a report via a simplified online form, accessible without legal assistance. The GPDP promptly assesses each case and, where appropriate, issues urgent measures within 48 hours to prevent the dissemination of the content. To ensure maximum confidentiality, materials are processed and shared with platforms in hash format only.
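The entry does not specify the hashing mechanism, so the Python sketch below is only a rough illustration of the general idea of sharing fingerprints rather than content: a digest of the reported material is computed so that only the digest, never the material itself, would need to be exchanged with a platform. Real-world matching systems typically rely on dedicated or perceptual hashing, which this sketch does not attempt to reproduce.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest so that only this fingerprint, never the
    material itself, would be shared with platforms for matching."""
    return hashlib.sha256(content).hexdigest()

# Purely illustrative: a platform could compare the digest of an upload
# against the reported digest without ever receiving the reported content.
reported_digest = fingerprint(b"<bytes of the reported image>")
upload_digest = fingerprint(b"<bytes of a new upload>")
print("block" if upload_digest == reported_digest else "allow")
```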
This tool is available to both adults and minors, with particular attention to the heightened vulnerability of younger users. Accessibility, confidentiality, and timeliness are its defining features, enabling intervention before harm occurs.
The GPDP has also established direct, structured channels with major online platforms and digital service providers, ensuring prompt compliance with removal or blocking orders. In parallel, it has promoted awareness campaigns and educational activities – especially in schools – aimed at fostering a culture of respect, consent, and digital dignity.
Since June 2024, reports submitted through the dedicated channel have increased by 70%, reflecting growing public trust and awareness. The initiative has enabled hundreds of timely interventions and has become a key reference point for those seeking immediate protection from digital abuse.
Why does the initiative deserve to be recognised by an award?
This initiative is a clear and effective response to a serious and growing problem: the non-consensual sharing of intimate images online. It is a harmful form of digital violence, often targeting women and minors, with serious emotional and personal consequences.
The GPDP created a simple, fast, and accessible tool that allows people—even very young users—to act quickly and stop the spread of such content before damage is done. It’s a practical system that works and has already helped many individuals.
Since June 2024, reports have increased by 70%, showing how urgent the issue is and how valuable and trusted this initiative has become.
This project shows that a data protection authority can play a key role not only in enforcing rules, but also in protecting people’s rights and preventing real-world harm.
The approach is innovative and, with the ongoing development of technology, is expected to deliver increasingly concrete results. It can serve as a model for other countries. This initiative combines efficiency, speed, and a strong focus on individuals, contributing to the creation of a safer and more respectful digital environment.
B7 – Entry by: Hellenic Data Protection Authority
Description of the initiative:
The Hellenic Data Protection Authority developed a comprehensive privacy education initiative specifically tailored for children, focusing on the safe and informed use of online services, as part of the project ‘byDefault’, funded by the European Union’s CERV program.
In its initial phase, the project developed educational resources that featured clear learning objectives and age-appropriate messaging. These materials were then evaluated and refined to ensure pedagogical effectiveness, incorporating a variety of learning methods to engage diverse student needs and learning styles.
An educational tool was developed to train primary and secondary school students, with the goal of strengthening their understanding of privacy and data protection. This tool is a hybrid physical-digital augmented reality (AR) game called Tzimanious (meaning “smart cookie”). Through gameplay, students learn to navigate the Internet wisely and cleverly, gradually developing a form of digital expertise.
The AR game combines both physical and digital (“phygital”) features: it consists of physical components, such as a board, pawns and cards, as well as digital elements (an app that must be installed on a mobile phone or tablet and used during gameplay), and it aims to make students aware of how to protect their personal data. It is played by 2 to 6 players or groups. The goal of each player or group is to move their pawn through the eight stations of the game, answering questions about personal data and collecting as many diamonds as possible.
Ultimately, the AR game and the accompanying educational material are expected to be incorporated into the school curriculum at both primary and secondary levels. This development stems from a proposal submitted by the Hellenic Data Protection Authority to the Minister of Education, who proved to be an enthusiastic supporter of the initiative.
Furthermore, as part of the project, a training and support program for teacher development has been created in order to establish a culture of responsibility and respect for personal data within the educational community. This is achieved by enhancing teachers’ knowledge and skills, thereby increasing their ability to promote these issues among their students.
Why does the initiative deserve to be recognised by an award?
The interactive AR game combines a traditional board game format with modern technology to keep children engaged and encourage organic peer-to-peer sharing, thereby enhancing learning outcomes.
It addresses key topics such as Internet and social media use, the concept of personal data, and risks related to sharing children’s data online. It also offers practical guidance on navigating social networks, recognizing suspicious behavior, understanding cookies, and identifying manipulation tactics online.
The game was pilot-tested in real classroom settings after a 4-hour webinar that trained participating teachers. Over 500 students from more than 20 classrooms across public and private schools in Greece took part in the pilot phase. Results showed a positive impact on both learning and classroom dynamics.
Combining Augmented Reality (AR) with Game-Based Learning (GBL) proved pedagogically effective: AR enables interaction with real-world learning objects like maps and books, while GBL introduces a playful element. Together, they create an immersive and engaging educational experience.
The game is ready for use in schools, and the HDPA plans to make it accessible to any other interested stakeholders.
B8 – Entry by: Information and Privacy Commissioner of Ontario (IPC)
Description of the initiative:
The Office of the Information and Privacy Commissioner of Ontario launched the Transparency Challenge to encourage government openness and provide a unique and creative forum for institutions to showcase their innovative projects that advance open data and government transparency in ways that improve Ontarians’ lives.
This year’s showcase focuses on model ways that government institutions are building trust with their citizens by balancing privacy and transparency in the way they collect, use, and disclose personal information.
The exhibits are each represented by a unique piece of artwork. This collection has been specially curated to shine a light on best-in-class efforts in transparency and access to government information for the benefit of Ontarians.
The IPC’s Transparency Showcase illustrates the importance of access rights, transparency and open government. The virtual gallery offers visitors a chance to browse the projects through captivating audio and video, graphics, and descriptions that bring the initiatives to life.
The breadth and quality of exhibits from across Ontario’s public institutions, including provincial ministries, municipalities, schools, universities, and police services, provide inspiring models for other institutions to follow.
Featured exhibits include:
- City of Toronto’s Public Walking Tour educates the community on sensor technology and privacy implications through an interactive walk in the Entertainment District.
- Town of Innisfil’s Technology in Public Spaces provides interactive signage to inform residents about technology embedded in public spaces, such as sensors in park waste bins.
- McMaster University’s “AI Dialogues” podcast series explores the ethical and practical questions of generative artificial intelligence (AI) in higher education, complementing consultations that are shaping new guidance.
Today, more than 30 projects are featured in two virtual galleries, providing a remarkable range of initiatives aimed at increasing understanding and appreciation for open data and access to government information. The goal is to inspire others towards greater transparency as well.
Why does the initiative deserve to be recognised by an award?
Transparency is about empowerment and helps build public trust. It equips people with the information they need to participate meaningfully in the democratic process, engage in constructive discourse, and hold their governments accountable. It’s the bedrock that democracy is built on, inspiring public trust and providing trustworthy, evidence-based information to shape public policies, programs, and services that improve people’s lives.
As data protection regulators, our role is not only to sanction bad behaviour, but to encourage good behaviour too. The IPC’s Transparency Showcase celebrates the beauty and benefits of government transparency and open data for the day-to-day lives of Ontarians, inspiring other institutions towards greater transparency too. Ultimately, we believe this unique initiative builds a culture of compliance and underscores just how important transparency is to a healthy democracy.
B9 – Entry by: Information Commissioner’s Office
Description of the initiative:
We recognised that social media and video streaming is a rapidly evolving market. We first identified apps of interest, selecting 34 social media and video streaming platforms whose Terms of Service allow under-18s to use them. We focused on creating accounts for 13-17 year olds and attempted sign-up as an under-13 year old so that we could gain a better understanding of the real-life experiences of children.
Over 4 weeks of testing we created new user accounts using proxies for children of different ages to replicate the sign-up processes that children would follow. We recorded the steps a child would need to take to set up an account, the default settings platforms offered, any privacy information provided to users and basic app functionality (including making a post).
Before the research started, we created a walkthrough methodology, a template log to record actions, and a series of proxy user details and email addresses. We did not interact with other users. We used proxy information to create each account with a different persona and individual contact details. We trained a small multidisciplinary team to undertake testing on real devices.
We used a mix of Android and iOS devices to test real-world experience. We logged each action we took to create a written time-stamped record, in addition to making screen recordings and screenshots. This methodical approach created a baseline of understanding from which we could also assess any future changes made.
Once data was recorded, it was assessed and RAG-rated (red/amber/green) across four areas (an illustrative record sketch follows this list):
- Targeted advertising,
- Accounts being private by default,
- Geolocation settings, and
- Age assurance measures.
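The ICO’s actual log template is not reproduced in this entry, so the following Python sketch is a hedged illustration only: one time-stamped walkthrough record carrying RAG ratings for the four areas above, with every field name an assumption.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Literal

Rag = Literal["red", "amber", "green"]

@dataclass
class WalkthroughRecord:
    """One test of one platform with one proxy persona; field names are illustrative."""
    platform: str
    persona_age: int                      # age of the proxy persona used at sign-up
    device: Literal["android", "ios"]
    tested_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    # RAG ratings for the four assessment areas
    targeted_advertising: Rag = "amber"
    private_by_default: Rag = "amber"
    geolocation_settings: Rag = "amber"
    age_assurance: Rag = "amber"

# Hypothetical entry of the kind the published comparison table could be built from.
record = WalkthroughRecord(
    platform="ExampleApp", persona_age=13, device="android",
    targeted_advertising="red", private_by_default="green",
    geolocation_settings="green", age_assurance="amber",
)
print(record)
```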
In our first tranche of testing we undertook 92 separate tests. We used this work to focus engagement and regulatory action. We also created and published a comparison table providing the public with information on each platform.
We’ve subsequently developed this work, in certain cases setting up multiple proxy accounts to understand what data is processed and shared when users interact with each other. We’ve also developed a framework to assess harmful material that we’ve observed through testing.
Why does the initiative deserve to be recognised by an award?
This work has added to the current understanding of what children are doing online and directly links that to where we can improve the landscape. This has provided us with real world experience which framed our understanding alongside other academic, regulatory, governmental and civil society sources.
Transparency has been a key driver of this work. Not only has it informed strategy and engagement, it’s also been used to inform the public and push publicly for change. Through this work we’ve published high level findings and a comparison table.
We created a testing methodology that is robust and replicable, with a fast turnaround. We were able to manage this process with existing internal resources – something that other DPAs could also do.
We have a much better understanding of what we hear from stakeholders as we’ve directly experienced it. This has allowed us to move at pace and be targeted in our work, focussing on areas of highest impact and allowed us to secure tangible changes across a range of areas (including targeted advertising, geolocation processing and default privacy settings). This also provides us with a better understanding of the impact of our work on real life experiences.
B10 – Entry by: UK Information Commissioner’s Office
Description of the initiative:
Generative AI poses novel challenges to people’s information rights and to the application of data protection law. These challenges include:
- the vast scale of web-scraping that occurs to build some of the most widely used datasets for training generative AI models – often without people knowing their data has been used in this way;
- the purpose(s) that people’s data is used for and how these are determined and justified;
- the accuracy of the data used to train these models, and the accuracy of outputs produced by them;
- how people can exercise their information rights, particularly if they don’t know their data has been processed in the first place; and
- who is responsible for complying with data protection law when models are accessed by deployers in different ways (such as through an application programming interface or by downloading an openly available model).
To understand these challenges better and test our thinking on how data protection law applies in these circumstances, we wanted to engage widely and put our initial approaches to each challenge into the public domain. In 2024 we ran a consultation series, gathering views from the public, tech companies, legal firms, the creative industries and trade bodies. We sought input on the following:
(1) the lawful basis for web-scraping to train a generative AI model
(2) purpose limitation in the generative AI lifecycle
(3) accuracy of training data and model outputs
(4) engineering individual rights in generative AI models, and
(5) allocating controllership across the generative AI supply chain.
We published the results of the consultation in December 2024. We retained our positions on (2), (3) and (5), and refined our positions on (1) and (4). A key finding of the consultation was a serious lack of transparency, especially in relation to training data within the industry, which the consultation responses show is negatively impacting the public’s trust in AI. Without transparency, it is hard for people to exercise their information rights and hard for developers to use legitimate interests as their lawful basis to use web-scraped data to train their models.
Why does the initiative deserve to be recognised by an award?
This consultation and final report represented the first detailed guidance from any data protection authority on generative AI. Our open, engaging consultation process meant that it was well received by stakeholders including AI developers, the creative industries and civil society. Our positions have been mirrored in subsequent EDPB guidance, and the Spanish data protection authority has translated the section on ‘Tackling Misconceptions’ into Spanish.
The iterative nature of the consultation provided a wide range of stakeholders with space to consider each issue in depth. It also allowed the ICO to understand each issue from a variety of viewpoints, and enabled us to consider the impact of generative AI and data protection law on different sectors.
As generative AI scales across the economy, our final report enables actors across the generative AI chain to understand how data protection law applies to this novel technology, allowing them to innovate in a compliant way.
B11 – Entry by: New Zealand Office of the Privacy Commissioner
Description of the initiative:
New Zealand does not have specific rules for biometric information. OPC is proposing to create some through a code of practice under the Privacy Act 2020. Our challenge was to consult both legal and non-legal audiences on our exposure draft of the code (a technical document).
We had several challenges:
- We are a small office with limited resources for this work.
- We needed to talk to a wide group of people about a technical issue.
- We knew that biometric information was tapu (sacred) for Māori (New Zealand’s indigenous people) and we needed to take special care to listen to this group.
- We didn’t have money for design and had to work with a website that wasn’t modern.
We created a hierarchy of layered information that people could engage with at their own level. This included the most technical layer (the code itself), a detailed consultation document written in plain language, an infographic that presented the code graphically, and a one-page consultation that summarised the main changes of the code into three questions. We used in-house skills and the organisation’s Canva account.
Because we were a small team, we front-footed questions with a clear banner on our web page and a detailed auto-reply message, to ensure time was spent well.
We met face-to-face with Māori stakeholders to make sure we heard their concerns appropriately. We also worked to develop detailed stakeholder lists that were highly segmented with bespoke messaging to spark the interest of our many user groups: government, business, legal, health, NGOs and civil liberty groups, and individuals that had self-nominated to be notified when consultation opened.
Our work was supported with a media campaign, launching with a 20-minute interview with the Privacy Commissioner on RNZ, our national broadcaster.
During the four-week consultation period our biometrics web page had over 3000 unique visitors.
Our goal for success was 50 submissions from the public and 50 from experts or organisations. As a result of this campaign, we received 70 submissions from experts and organisations and 179 submissions from individuals. Their feedback will inform the design of a final biometrics code.
Why does the initiative deserve to be recognised by an award?
Biometric technologies are likely to become part of every New Zealander’s life, but many do not know that yet. As an Independent Crown Entity, we could have written a legal document and then let the experts comment. However, we chose to widen the circle and include, through clear and plain language and an engagement plan, a wider range of people who will ultimately be affected by this work.
This approach, especially the activities like creating an infographic and distilling the code to three core questions, was a new and at times challenging way of working for the team. However, by all pulling together for a common goal we were able to present a technical document in a way that was accessible and therefore received a wider range of submissions.
New Zealand is known as a country of people who are innovative. We took that spirit, and that of our Nobel Prize-winning chemist Ernest Rutherford, who famously said, “We haven’t got the money, so we’ll have to think.”
Our exposure draft is rightly a very technical legal document and OPC presented it in several ways to ensure that it could be understood and engaged with by a large audience.
B12 – Entry by: Spanish Data Protection Authority (AEPD)
Description of the initiative:
Created and launched in 2023, this initiative proves that age verification on the Internet can be carried out without exposing children to targeted attacks or infringing on individuals’ data protection rights.
Our initiative champions an innovative approach where child protection does not require identifying children or collecting data from them. Instead, the responsibility lies with adults to prove that they have permission to access adult content. This approach automatically safeguards children without requiring any action from them or their devices, ensuring they cannot access harmful content.
By adhering to the set of proposed principles, which are derived from the GDPR, the implementation of this approach would effectively uphold the fundamental rights of citizens on the Internet. It would protect their anonymity and shield them from any unlawful processing of their personal data.
Moreover, this approach leverages existing identity documents, eliminating the need to create new identity infrastructures. This preserves individuals’ right to their own identity and allows for universal implementation across different countries.
Summarizing this initiative:
First, it provides a risk assessment of the available age verification systems (released as an infographic) to establish a Decalogue of principles that particularizes the GDPR principles to this application domain.
Second, it implements three different proofs of concept (PoCs) demonstrating that compliance with this Decalogue is possible and that the proposed approach could already be offered, with a clear separation between identity management, content filtering and the age verification itself. These PoCs show that age verification can be performed on the data subject’s device, which retains complete control over the user’s identity and age data and allows for fully auditable and transparent solutions (a schematic sketch of this on-device idea follows this summary). The implemented PoCs can be seen in these videos:
– PoC for PCs and consoles (Windows)
– PoC for smartphones (Android)
Third, it is the key element of an ambitious Global Strategy on Children, Digital Health and Privacy promoted by the Spanish DPA, which includes 35 measures focusing on education, digital health and well-being.
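The PoCs themselves are not included in this text, so the Python sketch below is only a schematic of the underlying idea: the check runs on the user’s device, which holds the identity document, and nothing but a yes/no attestation leaves it. All names here are assumptions for illustration, not part of the AEPD’s implementation.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class IdentityDocument:
    """Held only on the user's device; in this sketch it never leaves the device."""
    holder_name: str
    date_of_birth: date

def age_attestation(doc: IdentityDocument, threshold: int = 18) -> bool:
    """Computed locally: returns only an over-threshold yes/no, so neither the
    identity nor the exact age is disclosed to the content provider."""
    today = date.today()
    age = today.year - doc.date_of_birth.year - (
        (today.month, today.day) < (doc.date_of_birth.month, doc.date_of_birth.day)
    )
    return age >= threshold

# The service would receive only the boolean result of this local check.
doc = IdentityDocument(holder_name="Jane Doe", date_of_birth=date(1990, 5, 1))
print(age_attestation(doc))
```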
Why does the initiative deserve to be recognised by an award?
This initiative is committed to children’s protection, aligning data protection rights and evidence-based innovation to improve online safety standards. Recognizing this initiative with an award highlights the importance of this alignment and encourages further development.
The initiative has already received awards at the national level, for example:
- Data Cybersecurity Award at the Socinfo Digital Awards (February 2024).
- Public project award at the II National Computer Awards (March 2024).
- Public sector award at @aslan Awards (April 2024).
However, its impact extends beyond national borders, and the initiative’s success resonates globally. This allows the AEPD to contribute to a safer digital environment by collaborating with ISO (on the elaboration of the 27566 standard), the European Data Protection Board (drafting a new statement) and the European Commission (participating in the Task Force on Age Verification under the Digital Services Act), to mention only some significant examples.
Since the initiative focuses on actionable steps, we are also collaborating with both the Spanish and European pilot projects to provide harmonised solutions for age verification based on our initiative. Furthermore, significant efforts have also been made in dissemination and awareness, actively sharing knowledge through different conferences and scientific publications (for example, at the Annual Privacy Forum 2024).