Understanding Europe’s Big Six Data Protection Regulators

Not all data protection regulators are the same. Each of the European Union’s 27 General Data Protection Regulation (GDPR) regulators has its own approach to enforcement, so it is important to understand the key differences in their powers, priorities, procedures and ways of working. Together, they form the European Data Protection Board (EDPB), and their differing approaches strongly influence the nature, tone and substance of the EDPB’s output. We call this group of GDPR regulators “Europe’s Big Six” because of their enforcement output, capabilities, consistency, size and influence. These regulators help to set the agenda for GDPR, data protection and information security, in Europe and around the world. They have become bellwethers. Analysing these regulators can help companies and organisations navigate the enforcement landscape, understand regulatory risks and decide how to engage with them effectively. The Big Six Regulators are the European Data Protection Board (EDPB), France’s CNIL, Spain’s AEPD, Germany’s Hamburg and Baden-Württemberg regulators (together), Italy’s Garante and the Netherlands’ AP.

EU: European Data Protection Board (EDPB)

The EDPB is the EU’s data protection super-regulator, bringing together all 27 GDPR regulators, the EFTA EEA authorities and the European Data Protection Supervisor (EDPS). Its powers are distinct from those of each country’s regulator, but it has formidable convening powers. The EDPB helps to resolve EU cross-border cases. Its Dispute Resolution mechanism is a powerful system that allows a decision by one GDPR regulator to be reassessed, and fines or penalties increased, following interventions by other EU GDPR regulators. The EDPB’s Opinions are highly regarded and help to shape EU data protection interpretation and good practice. The EDPB also publishes Guidelines, Recommendations and Best Practice. The Board can use its Urgency Procedure to assist a GDPR regulator to adopt an urgent measure needed to protect the rights and freedoms of individuals. The EDPB can, on its own initiative, intervene to monitor the correct application of the GDPR, advise the European Commission, answer questions on GDPR application, give Opinions on Codes of Practice and Certifications, give opinions on data protection adequacy, promote co-operation, exchange information and facilitate shared investigations. The EDPB’s Strategy and Work Programme, Annual Report 2021, Vienna Statement on Enforcement Co-operation 2022 and Selection Criteria for Cases of Strategic Importance 2022 show an intent to increase future co-operation and effectiveness.

France: Commission Nationale de l’informatique et des Libertés (The CNIL)

The CNIL, based in Paris, was established in 1978, before European data protection law was comprehensively set out. It is one of the larger GDPR regulators, with an established heritage in privacy and data protection. It has a long history of enforcement, which has been bolstered by the GDPR. In 2019, the CNIL fined Google €50 million for numerous GDPR breaches on transparency and consent. In 2022, it published a decision about Google Analytics’ non-compliance with GDPR, which sent reverberations across the EU and the world. The CNIL’s Strategic Plan 2022-2024 focuses on promoting the control and respect of individuals’ rights, promoting the GDPR as a trusted asset for organisations and prioritising targeted regulatory actions for high-stakes privacy issues. The high-stakes areas of focus include smart cameras, data transfers in the cloud and smartphone data collection.

Spain: Agencia Española de Protección de Datos (AEPD)

The AEPD, based in Madrid, is Spain’s national data protection regulator, established in 1993. Spain also has three regional data protection regulators, in Andalusia, the Basque Country and Catalonia. The AEPD is known for its opinions, guidance and tools on emerging technologies and issues such as Big Data, the right to be forgotten, wifi data collection (Google Street View), cookies, data breaches and Data Protection Impact Assessments (DPIAs). The Agency is one of the most frequent issuers of GDPR fines in the EU. These penalties are often relatively modest (below €200,000), but are spread across a wide range of sectors, industries, public bodies and sizes of organisation. Its in-country regulatory reach is one of the EU’s broadest and it has adjudicated and enforced in many sectors and technologies. Its largest fines have included Vodafone Spain (€8.5 million), BBVA (€5 million) and CaixaBank (€9 million and €3 million). Its Annual Report 2021 (in Spanish) shows an active and engaged organisation.

Germany: Hamburg and Baden-Württemberg

Germany has a two-tier data protection regulatory system. The German Federal Commissioner for Data Protection and Freedom of Information (Bundesbeauftragte für Datenschutz und Informationsfreiheit – ‘BfDI’) represents Germany at the EDPB. Germany has about 19 federal and regional data protection authorities responsible for monitoring data protection implementation throughout the country. To ensure consistency, members of all German regulators for the public and private sectors form the Data Protection Conference (Datenschutzkonferenz – ‘DSK’). This arrangement mirrors the consistency mechanism set out in the GDPR and practised by the EDPB. German data protection enforcement is best understood by reviewing the output of these regional GDPR regulators.

Two of the most active and vocal German regulators are Hamburg (Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit – HmbBfDI) and Baden-Württemberg (Der Landesbeauftragte für den Datenschutz und die Informationsfreiheit Baden-Württemberg – LfDI). The Hamburg regulator fined Hennes & Mauritz Online Shop A.B. & Co KG, a Hamburg-based subsidiary of the Swedish fashion and textile company H&M, €35 million for GDPR breaches. It also imposed a fine of €51,000 on Facebook Germany GmbH for failing to notify its Data Protection Officer in Germany, and it has questioned and investigated the GDPR compliance of Zoom and Google Analytics. In the early months of the GDPR coming fully into force, the Baden-Württemberg regulator imposed a fine of €20,000 on a social media provider for breaching the GDPR’s data security obligations. In 2022, it published detailed Frequently Asked Questions (FAQs) on cookies and similar technologies.

Italy: Garante per la protezione dei dati personali (The Garante)

The Garante, based in Rome, is Italy’s national data protection regulator, established in 1997. The Garante appears to be very selective about the large-scale interventions and enforcement actions it takes. However, many of these actions address the most important GDPR principles, target key sectors and focus on new and emerging technologies. It has been involved in some of the largest and most high-profile GDPR enforcement cases and fines, such as TIM SpA (€27.8 million), Enel Energia (€26.5 million) and Clearview AI (€20 million). The Garante is permitted to keep 50% of the fines it collects for its own operations; the other half goes to the Italian Government’s central funds. The Garante has agreed with the French, Danish and Austrian GDPR regulators that Google Analytics’ personal data collection and transfers from the EU to the USA breach the GDPR. The Garante’s Annual Report 2021 (in Italian) shows a bold and confident regulator.

Netherlands: Autoriteit Persoonsgegevens (AP)

Autoriteit Persoonsgegevens (AP), based in Den Haag (The Hague), was originally called the Registratiekamer and, later, the College bescherming persoonsgegevens (CBP). The AP, in its current form, was established in 2016. It has a statutory duty to assess whether organisations, including government bodies, comply with Dutch data protection law. The AP’s Strategy Focus Areas for 2020-2023 deliberately prioritise three digital society themes: data trading, digital government in central and local authorities, and artificial intelligence and algorithms. The AP fined the Netherlands Tax Administration (Belastingdienst) €3.7 million for misusing its Fraud Signalling Facility blacklist and breaching the GDPR, causing loss and damage to Dutch families. The AP has also taken enforcement action against the Dutch Ministry of Finance (€2.7 million) and TikTok (€750,000), and has investigated Microsoft’s services. The AP’s Annual Report 2021 (in Dutch) shows a confident, minimalist regulator with a reputation for applying strict interpretations of the GDPR.

Other Related Developments and Trends

The UK’s Information Commissioner’s Office (ICO) and the Data Protection Commission Ireland (DPC Ireland) together form the largest native English-speaking data protection and GDPR regulatory block in Western Europe. The ICO remains large, influential and relatively well staffed and resourced. However, the UK’s departure from the EU (Brexit) means that it is no longer an EU GDPR regulator, or a contender for Europe’s Big Six, and it is likely to diverge increasingly from the family of EU GDPR regulators. The ICO’s future will be decided over time. DPC Ireland is growing in capability and influence and is one to watch in the next 2-5 years. The EDPB’s Dispute Resolution mechanism is being used by the Big Six regulators, and others, to challenge DPC Ireland’s draft decisions internally, expanding its GDPR analysis and increasing the size of its GDPR fines and penalties.

Other key data protection regulators to watch are Denmark, Poland, Austria and Norway. Looking ahead, significant enforcement action can come from any of the EU’s 27 GDPR regulators, at any time, acting alone, acting together or acting in co-ordination with the EDPB. EU data protection regulation and enforcement therefore remain a dynamic and fast-moving environment.

For help with EU/EEA/UK GDPR compliance, data protection regulatory investigations, GDPR enforcement support, data breach response, Data Protection Officer (DPO) services, EU Data Protection Representative services and our Legal & Regulatory Support services, contact PrivacySolved:

Telephone:  +353 1 960 9370 (Dublin)

Telephone:  +44 (0) 207 175 9771 (London)

Email: contact@privacysolved.com

PS102022

PrivacySolved x Cyber Security Month

PrivacySolved provides leading expertise in information security strategy, cybersecurity awareness, data protection, data breach planning and data breach response. October is European Cybersecurity Month #CyberSecMonth. It is also Cybersecurity Awareness Month #CybersecurityAwarenessMonth which is celebrated in North America and all around the world. Listed below is a collection of trusted information security resources, cybersecurity insights and tools to inform, engage and inspire the information security community, businesses and organisations. This information is also very useful for our clients, partners, colleagues and network contacts around the world. We aim to stay connected to drive excellence in information security, cybersecurity and data protection and to respond to emerging security threats and risks.

Cybersecurity Insights

Emerging Cyber Threats: Geopolitics, Deep Fakes and Vishing

The Ransomware Problem: Five Steps to Success

The Ransomware Problem: Board and Leadership Priorities

Ireland’s Cautious Cybersecurity Outlook

Cybersecurity: Focus on the Netherlands’ Information Security Outlook

Cybersecurity and Cyber Resilience in the Fintech Sector

Data Breach Reporting Guidance and Tools

ENISA: Personal Data Breach Notification Tool

Europol and European Cybercrime Centre (EC3)

CISA: Stop Ransomware (USA)

FBI Field Office Cyber Task Forces

Internet Crime Complaint Center (IC3)

National Cyber Security Centre (UK)

Information Commissioner’s Office UK

National Cyber Security Centre (Ireland)

Data Protection Commission Ireland

Essential Online Resources

European Union Agency for Cybersecurity (ENISA)

Cybersecurity and Infrastructure Security Agency (CISA)

US-CERT

Federal Bureau of Investigations (FBI)

Interpol

Australian Cyber Security Centre

Canadian Centre for Cyber Security

National Cyber Security Centre (Netherlands)

National Cyber Security Centre (New Zealand)

The National Cybersecurity Agency of France

Cyber Security Agency of Singapore

The Cyberspace Administration of China (CAC) 中华人民共和国国家互联网信息办公室

BCS, The Chartered Institute for IT

Global Forum on Cyber Expertise (GFCE)

Global Cyber Alliance

SANS Institute

ISACA

(ISC)2

For help, advice, support, information strategy, cybersecurity consulting, policies, procedures and data breach response services, contact PrivacySolved:

Dublin +353 1 960 9370

London +44 207 175 9771

Email: contact@privacysolved.com

Emerging Cyber Threats: Geopolitics, Deep Fakes and Vishing

The information security and cybersecurity threat landscapes are always changing. Threat actors are becoming more sophisticated and threat surfaces are expanding. Exploitable gaps in new technologies are increasing and cybercrime business models are also growing. Added to this mix are a fragile geopolitical situation in many parts of the world, a tightening global economy and simple opportunism. Here are three new and emerging cyber threats to know about, monitor and guard against:

Geopolitics

Geopolitics describes how politics, geography, demography and economics affect foreign policy and the relationships between countries. Recent wars and civil wars, the 2007/2008 global financial crisis, the Covid-19 Coronavirus pandemic and the effects of climate change have reshaped the competition for global resources and created new political and economic alliances. Some countries are increasingly active in offensive cyberattacks and information security breaches to advance their political and economic goals. Russia, Belarus, North Korea, Iran and China have been identified as countries involved in sophisticated cybersecurity operations, many aimed at critical national infrastructure targets. Most countries have defensive cybersecurity capabilities. Hacktivism has also grown, some political, some environmental and some related to cybercrime and money laundering.

For many companies and organisations, the threat landscape has become complex and sophisticated. There is a need to grow threat intelligence capabilities, monitor key geopolitical events and understand that cyberattacks are not always targeted on their operations specifically, but often cause knock-on effects. Cyberattacks can affect businesses and organisations because they are part of a targeted supply chain, are based in a country, use certain IT services, supply critical infrastructure services or trade with certain foreign states.

Deep Fakes

A deep fake is media, such as an image, video or audio recording, that has been created or altered by manipulating a person’s appearance, actions or voice using artificial intelligence techniques such as deep learning. Some deep fakes have shown politicians, business leaders and celebrities saying and doing things they would not normally do. Deep fakes may be used for state-related espionage, cybercrime, sophisticated social engineering or traditional crimes like blackmail and extortion. Companies and organisations should educate themselves, develop techniques to spot deep fakes and identify the most likely sources. They should also use technologies to limit their impact and report incidents to national cyber security bodies or CERTs, so that trends can be monitored and high-level responses and best practices can be developed.

Vishing (Voice Phishing)

Vishing uses fraudulent phone numbers, voice-altering software, text messages and social engineering to trick users into divulging sensitive information; it generally relies on voice data to deceive its targets. Smishing is a related form of phishing that uses SMS text messages to trick users, and it can be deployed alongside voice calls, depending on the attacker’s preferences and objectives. Businesses and organisations should control and monitor the use of voice data, especially among senior officials. Public disclosures of voice data should be kept to a minimum, and two-factor authentication should be used to avoid impersonation, social engineering, fraud and identity theft.

Conclusion

Chief Information Officers (CIOs), Chief Information Security Officers (CISOs), Data Protection Officers (DPOs), Chief Privacy Officers (CPOs), Chief Data Officers (CDOs) and Senior Leaders should ensure that they receive detailed and diverse sources of threat intelligence. They should aim to understand evolving analysis and attack-pattern information that they receive. Leaders should try to share information with trusted parties and partners to build resilience and reduce the risks and impacts of cyberattacks. Businesses and organisations should update their cybersecurity insurance policies to make sure that they are sufficiently covered for new and emerging cyberattacks. Above all, leaders should be continuously learning and display high levels of curiosity and analysis.

For help, advice, consulting and strategy support services, data protection reviews, GDPR gap analysis, cybersecurity policies and procedures and access to our data breach response services, contact PrivacySolved:

London +44 207 175 9771

Dublin +353 1 960 9370

Email: contact@privacysolved.com

PS082022

techAnalysis: Why Data Scraping and Using Online Images of People for Facial Recognition can breach Data Protection Laws

When a person publishes their image online, many might think that the public image can be widely re-used by others, for new and unrelated purposes. It is true that there are very few privacy and confidentiality rights that protect these published images. Intellectual property rights, such as copyright, may sometimes be relevant. However, other important rights such as contractual rights, fair use rights and data protection rights must always be fully considered.

Clearview AI Inc has been fined by data protection regulators in the UK, Greece, Italy, France and Australia for misusing images and key technologies. In the USA, it has faced class action lawsuits. This techAnalysis examines data scraping, web scraping, facial recognition technology, artificial intelligence and the complexities of re-using the online images of individuals.

Data Scraping, Facial Recognition Technology and Artificial Intelligence

Data scraping is the process of using a computer programme to extract data from the output generated by another programme. Web scraping is a popular form of data scraping in which a computer application is used to extract valuable information from a website, including copying the images of individuals.
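To make the mechanics concrete, the following minimal Python sketch shows how simply image references can be scraped from a web page using only the standard library’s html.parser. The page content and URLs are illustrative, not taken from any real site; in practice a scraper would first fetch the HTML over the network (for example with urllib.request).

```python
from html.parser import HTMLParser

class ImageScraper(HTMLParser):
    """Collects the src attribute of every <img> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.image_urls.append(src)

# Illustrative HTML snippet standing in for a fetched web page
page = """
<html><body>
  <img src="/photos/person1.jpg" alt="profile">
  <p>Some unrelated text</p>
  <img src="https://example.com/photos/person2.png">
</body></html>
"""

scraper = ImageScraper()
scraper.feed(page)
print(scraper.image_urls)
# ['/photos/person1.jpg', 'https://example.com/photos/person2.png']
```

The very small amount of code needed is the point: because extraction at scale is technically trivial, the legal controls discussed below — lawful basis, transparency, terms and conditions — do most of the work of protecting individuals.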

Facial Recognition Technologies are technical methods used to identify an individual from a digital image. These technologies rely on personal data and biometric data to identify individuals. In the EU and UK, the General Data Protection Regulation (GDPR) defines biometric data as personal data relating to the physical, physiological or behavioural characteristics of a person that is used to confirm their identity. Biometric data is included in the list of special categories of personal data in the GDPR. These are some of the most sensitive forms of personal data. The collection and use of these data constitute high-risk processing, requiring extra care, attention and, often, explicit consent. Other special categories of data include race or ethnic origin, political opinion, religious or philosophical beliefs, trade union membership, genetic data, health, sex life and sexual orientation. Data about criminal convictions and offences also attract similar special treatment.

Artificial Intelligence (AI) is the ability of a computer or computer-controlled robot to perform tasks and analysis in ways that are like those carried out by intelligent human beings. AI includes several techniques such as machine learning and deep learning. AI is often applied to achieve a variety of outcomes including problem-solving, reasoning, knowledge representation, natural language processing, learning, planning, perception, motion and manipulation, social intelligence and general intelligence.

The Story of Clearview AI

Clearview AI is an identity intelligence solutions company that promotes the accuracy and reliability of its facial recognition technology, which is powered by artificial intelligence. The company’s customers include the police, banks, transportation providers and governments. Clearview’s customers could upload a person’s image to the company’s application interface, which then checks for a match against the millions of images in its database. To provide its services, the company collected more than 20 billion images of people, and related data, from publicly available information on the internet and social media platforms globally for its online facial recognition database. This was done without the knowledge or consent of the individuals or the companies that published the facial images online.

Clearview AI has been fined £7.4 million (€8.75 million) by the UK Information Commissioner’s Office and £16.91 million (€20 million) by each of the Greek and Italian data protection regulators for using images of people on its online database in breach of data protection laws. The company has also been ordered to stop collecting and using the personal data it had unlawfully gathered and to delete this information from its systems. Clearview AI breached various laws around the world. Large technology companies and social media businesses have started to investigate these practices and take legal action against companies that scrape their data and copy their online information. These practices often breach the target business’ terms and conditions and fair use policies.

Five reasons why collecting and using images and data collected online breaches data protection laws

1. Failure to collect and use personal data in a fair and transparent way

Data protection laws require the collection and use of personal data to be fair and processed in ways that individuals expect. The use of personal data should not lead to unjustified and adverse effects on individuals. It is important to consider the lawfulness and fairness of personal data use before data processing starts. Transparent data collection and use requires clarity, openness and honesty towards the individuals involved, ensuring that they are properly informed and, where necessary, give their explicit consent.

2. Failure to have a lawful reason for collecting people’s online personal data

It is very important that those who collect and use personal data know and communicate the legal reason for processing data. Gaining the consent of users or those affected is one way of legally processing a person’s information, but there are other acceptable legal routes for data collection, such as:

  • Collecting or using personal data to fulfil a contract;
  • Collecting or using personal data to fulfil a legal obligation;
  • Collecting or processing personal data for public interest tasks or an official function;
  • Collecting or processing personal data for a legitimate personal or business interest or the interests of a third party;
  • Collecting or processing personal data to protect life or a vital interest.

3. Failure to have a process in place so that information is not held indefinitely

If there are no processes in place to establish the length of time for retaining personal data, a data protection regulator could find a breach of data protection law. Data retention is important. Personal data should not be kept for longer than necessary.

4. Failure to meet the higher data protection standards for biometric data

When collecting biometric data, or any other form of special categories personal data or sensitive information, all parties must ensure that they meet the higher standards for processing these data. Collecting and using these data is called high risk processing because the potential harm to individuals affected by data misuse could be substantial and severe.

5. Making the process hard for those who wish to object to their information and images being used

If a person wishes to find out whether their image is being used or stored, they should have access to a user-friendly and accessible process. Individuals should be allowed to exercise their data protection rights at any time and at little or no expense.

Advisory: Collecting and Using Online Images of Individuals

There are many issues to consider before collecting images of people from the internet:

  • Ensure that there is compliance with the data protection principles in the EU and UK, or with similar data privacy legal requirements around the world. A clearly identified lawful basis for data collection should be one of the first steps. This includes ensuring that all data extraction or copying is in line with the website or platform’s terms and conditions. One solution could be to get permission from the website owner, though individuals may still object to the copying and use of their image. Objections by the individuals affected should be fully considered and actioned.
  • Users affected should be properly informed about how their personal data will be used and allowed to exercise their rights to access, rectify or delete the information, as necessary.
  • Working with a Data Protection Officer (DPO) or Data Protection Adviser to complete a Data Protection Impact Assessment (DPIA) or Privacy Impact Assessment (PIA) is crucial. All parties should apply Privacy and Data Protection by Design techniques to reduce data protection risks. If the DPIA identifies risks that cannot be resolved, businesses and organisations may need to consult their data protection regulators before starting to collect images from the internet.

Conclusion

Care and attention are needed when collecting and using images from the internet for any new purpose, especially for facial recognition and artificial intelligence activities. Full legal awareness and proper processes and procedures are very important, or regulators could impose fines and order data to be deleted. This would reduce trust, limit business opportunities, curb innovation, be costly and severely damage reputations.

This techAnalysis is produced in association with Johnson May.

PrivacySolved has years of expertise in UK, EU and global data protection and has worked with the key regulators. We also advise on new technology and artificial intelligence compliance. For advice, support, projects and programmes, contact PrivacySolved:

Telephone:  +44 (0) 207 175 9771 (London)

Telephone:  +353 1 960 9370 (Dublin)

Email: contact@privacysolved.com

PS082022

The Future of UK Data Protection: Less, More, Complex and Uncertain

Briefing

The UK’s future data protection framework and laws are likely to differ significantly from the European Union’s General Data Protection Regulation (GDPR). The changes set out in the UK’s Data Protection and Digital Information Bill, published in July 2022, are a mixture of significant legal changes and superficial adjustments. In other places, long-established legal concepts have been renamed and redefined so that past and future EU legal and regulatory interpretation can no longer influence the emerging UK data protection regime. These updated definitions and new concepts will allow UK regulators and UK courts to interpret and develop these laws and rules in ways that are more UK-centric. The UK’s exit from the European Union (Brexit) automatically ended UK residents’ specific right to data protection set out in the EU Charter of Fundamental Rights. The legal fact of Brexit narrowed the scope of data protection in the UK, by default, and detached it from the EU institutions, courts, systems and mechanisms that had previously operationalised data protection. There are also plans in the UK to narrow the scope of the UK’s Human Rights Act 1998, which would further limit UK data protection. The UK is left with the UK Data Protection Act 2018, a truncated UK GDPR and a complex web of other laws to synthesise and interpret. These are all derivative laws, which together are more complex than the EU legal framework, yet retain key unifying elements. UK data protection is now less stable. New uncertainties abound and a period of re-learning will begin. It is unclear whether the UK will retain EU data protection adequacy over time.

Headline Changes

The definition of Personal Data has been narrowed. The new definition severs the link between personal data that can identify an individual directly and personal data that can do so indirectly. The legal test for identifiability has also been restricted. This means that the scope and reach of UK data protection is more limited for individuals, controllers and processors. While the new definition may appear technical, it will have practical effects on digital data, databases, cloud services, security strategies and risk profiles. The change in the law also automatically creates new pools of non-personal data, which fall outside the scope and reach of UK data protection.

The Purpose Limitation Principle has been expanded with legal tests to judge compatibility with new personal data uses. There are also new rules for assessing that secondary uses are compatible with original purposes. This creates new pathways for personal data re-use and secondary uses.

The Legal Bases for Processing Personal data have been broadened. Legitimate interest has been given a new prominence. A new list of data processing activities that automatically meet the legitimate interest balancing test has been introduced. This includes crime prevention, safeguarding the vulnerable, emergencies and democratic engagement. These new rules will encourage data sharing, especially by the government and the public services. The new rules also limit the scope for objection or refusal.

The Information Commissioner’s Office (ICO), the UK’s data protection regulator, will be abolished in its current form. This reform appears to be an attempt to remove the UK regulator from the orbit and influence of, and its history as part of, the European Data Protection Board (EDPB). The new Commission will come under more direct UK government control and supervision, and will be less independent. The Commission will have two distinct additional powers: the first is to require a controller or processor to prepare a report at their own expense; the second is an Interview Notice, requiring a person to attend a place to answer questions.

UK International Data Transfers have been removed from the EU GDPR framework. The EU’s restrictive data transfer default position has been replaced by a slightly more permissive UK approach. Data transfers can now proceed via UK Adequacy Regulations, UK Standard Contractual Clauses (SCCs), UK Binding Corporate Rules (BCRs) or UK Derogations for Special Situations. A new Data Protection Test has been introduced to guide the evaluation of UK data protection adequacy and the UK data protection equivalence of third-party countries.

Data Subject Rights have become more complicated and restrictive than under the GDPR. Requests can be refused if controllers decide that they are vexatious or excessive. This means requests made in bad faith, those intended to cause distress and those which are an abuse of process. Requests must be answered within 30 days, but at any time during this period the controller can extend the response time by a further two months (around 60 days) because of the complexity of the request or the number of requests. The data subject notice rules in GDPR Articles 13 and 14 have been restricted. No notice is required when collecting personal data for further processing (and re-use) for scientific or historical research, archiving in the public interest or statistical purposes, with appropriate safeguards, or if providing that information is impossible or would be a disproportionate effort.

A definition of Direct Marketing will be added to UK law in the Privacy and Electronic Communications (EC Directive) Regulations 2003 (S.I. 2003/2426), known as UK PECR. Direct marketing means “the communication (by whatever means) of advertising or marketing material which is directed to particular individuals.” The scope for using cookies without consent has increased and the definition of strictly necessary cookies has also been widened. New opt-outs have been introduced for unreceived messages and direct marketing for democratic engagement. There is a new duty to inform the regulator about unlawful direct marketing. UK PECR penalties have been increased.

A More Limited UK Data Protection Governance System

The Information Commissioner’s Office will be abolished, and a new organisation called the Information Commission will take its place and replicate most of its existing powers. The Information Commission will be more dependent on the involvement of a UK Government Secretary of State for objectives and direction. The Commission will be expected to do more reporting and outreach. The Commission will have a duty to encourage economic growth and innovation. The Commission will be given new powers to refuse to act on certain complaints, such as those that have been made prematurely or are vexatious or excessive.

The legal duty to appoint a Data Protection Officer (DPO) has been removed. The role of Senior Responsible Individual (SRI) has been created for public bodies and those that carry out high risk data processing. There is no legal duty for the SRI to be independent; instead, the organisation can direct and give instructions to the SRI about their work. The SRI must be a member of senior management.

The legal duty for foreign-based organisations to appoint UK Data Protection Representatives has been removed. The Information Commissioner and individual data subjects based in the UK will not have a formal legal route to engage with foreign-based companies that offer goods and services and target or monitor UK individuals.

The legal duty to keep a Record of Processing Activities (ROPA) has been retained, but it has been renamed Records of Processing of Personal Data. The contents of these Records are similar and serve a similar function. The new Records requirement does not apply to data controllers or processors that employ fewer than 250 individuals, unless they carry out data processing that is likely to result in a high risk to the rights and freedoms of individuals.

Data Protection Impact Assessments (DPIAs) have been replaced by Assessments of High Risk Processing. The scope of the new Assessment is more limited and the Senior Responsible Individual’s (SRI) direct involvement is not legally required.

The Office of the Commissioner for the Retention and Use of Biometric Material will be abolished, and its powers transferred to the Investigatory Powers Commissioner. The Office of Surveillance Camera Commissioner will also be abolished. The functions of the National DNA Database Strategy Board will be transferred to a new Forensic Information Database Strategy Board.

Changes to the UK Privacy and Electronic Communications Regulations (UK PECR)

UK PECR has been amended to allow a range of new exceptions to the historical restrictions placed on cookies and similar technologies storing information, or gaining access to information stored, in the terminal equipment of a subscriber or user. This means that there will be a greater scope to use and deploy cookies, web beacons and similar technologies in the UK. It is unclear how this will work in practice, especially for website services that target the UK, EU/EEA and the rest of the world. However, these legal provisions may lead to novel technical solutions and innovations.

New Ideas to Support Online Identification and Innovation

The proposed law contains new provisions to make Digital Verification Services (DVS) more reliable by initiating a trust framework, a register, an information gateway and a trust mark. UK Government Secretaries of State, or the organisations they nominate, will have new powers to request access to information secured by DVS. New definitions of business data, customer data, data holders, decision-makers and enforcers have been introduced. The new rules state that the UK Government will have power to regulate these actors and their activities. The new rules also include powers to encourage information technology that enables consent to be given, or to allow automatic objections.

The new law recognises European Union conformity assessment bodies under the EU eIDAS Regulation (trust services) and other overseas trust products and services.

The Future

The UK’s Data Protection and Digital Information Bill is a mixed picture. There is an attempt at data protection de-regulation. UK GDPR will be narrower in key areas, including the long-established definition of personal data. Importantly, UK data protection governance structures have been significantly scaled back, notably the new rules governing the Information Commissioner’s Office, Data Protection Officers and UK Data Protection Representatives. However, some of the new rules appear to be market-making for new technologies. Many of the legal changes substantially benefit the UK government, public services data sharing and their service providers. Nine senior Ministers have sponsored and support the new law. The sponsoring Secretaries of State have reserved sweeping and controlling powers to themselves. Companies and organisations will find that UK data protection is much more complex than EU GDPR, for what is a much smaller market. Further, UK data protection law can now change at any time in the future through easy-to-adopt regulations and direct government interventions.

PrivacySolved has years of expertise in UK, EU and global data protection and work with the key regulators. For advice, support, projects and programmes, contact PrivacySolved:

Telephone:  +44 (0) 207 175 9771 (London)

Telephone:  +353 1 960 9370 (Dublin)

Email: contact@privacysolved.com

PS072022
