Five Key Things to Know about Dubai DIFC Data Protection Law 2020

The Dubai International Financial Centre (DIFC) Data Protection Law 2020 (DP Law) applies to the DIFC financial services free zone in Dubai, United Arab Emirates, and took effect on 1 July 2020. The DIFC DP Law protects the personal data held and processed by organisations that are registered in the DIFC, as well as by linked external organisations. New data protection rights include the right to access personal data, the right to data portability, the right to withdraw consent, the right to object to automated decisions (including profiling) and the right not to suffer discrimination for exercising data protection rights. Businesses have an overriding duty to demonstrate compliance with the data protection principles. The DIFC Commissioner of Data Protection is the regulator. Regulator enforcement starts on 1 October 2020.

1. What types of organisations are covered by the DIFC DP Law?

The law applies to businesses that are registered in the DIFC or businesses that process personal data in the DIFC as part of stable arrangements. Businesses that process data on behalf of these organisations, such as their suppliers, are also covered by the law.

2. What types of data or information are covered by the DIFC DP Law?

The DIFC DP Law protects personal data, which is defined as information that identifies or makes living individuals identifiable. An individual is identified or identifiable by reference to an identifier such as a name, an identification number, location data, an online identifier, or one or more factors relating to the individual’s biological, physical, biometric, physiological, mental, genetic, economic, cultural or social identity.

3. What are the main DIFC DP Law obligations for businesses?

Businesses must:

  1. Comply with the additional data protection principles of accountability (demonstrating compliance) and transparency, and process personal data in line with the rights of individuals.
  2. Appoint a Data Protection Officer (DPO) if they are DIFC bodies or carry out high-risk processing on a systematic or regular basis. Other controllers or processors may appoint DPOs.
  3. Report data breaches, as soon as practicable in the circumstances, to the DIFC Commissioner of Data Protection and to affected individuals (if the breach poses a high risk to their security or rights).
  4. Register with the regulator and publish detailed data protection notices.
  5. Complete Data Protection Impact Assessments (DPIAs) for high-risk data processing.

4. If businesses comply with the European Union’s General Data Protection Regulation (GDPR), will they automatically comply with DIFC DP Law?

Yes, in large part, but not completely. The GDPR and the DIFC DP Law have different scopes, definitions, special provisions and compliance requirements. However, there are important similarities. The DIFC DP Law was enacted to include provisions that largely mirror the GDPR. It is likely that the DIFC will make an application to the European Union (EU) for an adequacy decision to ease international data transfers between the DIFC and the EU. GDPR data mapping and records of processing activities can help to identify personal data affected by the DIFC DP Law. GDPR privacy notices, policies and the processes used to respond to GDPR rights requests can assist DIFC DP Law compliance, but these must be tailored. Data processing agreements and online notices must be specifically updated.

5. Does the DIFC DP Law apply to foreign-based companies and what are the penalties for breach of the law?

Yes, it can. If foreign businesses are registered in the DIFC, or process personal data in the DIFC as part of stable arrangements, then the DIFC DP Law will apply. The law also applies to businesses that process data on behalf of organisations registered in the DIFC, or on behalf of organisations that process data in the DIFC as part of stable arrangements. The DIFC Commissioner of Data Protection can impose administrative fines of up to $100,000. DIFC Courts can order businesses to pay compensation to individuals.

Automated Decisions, Algorithms, Profiling and AI: EU Data Protection Lessons

Article

Companies and organisations should ensure that their data protection compliance is not reduced to a set of policies and procedures, quarterly reports and annual reviews. Data protection outcomes should not be synonymous with the introduction of enterprise privacy software, compliance team updates of controls or data privacy as intractable legal and IT add-ons to be overcome. Effective data protection should be dynamic and integral to day-to-day activities, in the way that workplace health and safety, financial probity and corporate good conduct flow through organisations, affecting almost every decision. Data protection should not play catch-up to digital transformation initiatives, IT strategy changes, research and development priorities or expansions of the supply chain. Data protection principles should be applied consciously to strengthen an organisation’s core DNA and operating model. As a result, whenever personal data are collected, stored or used, data protection should become a byword for responsible data management, excellent data ethics, protecting individual personal data, accountability, security, resilience, profitability, trust and innovation.

Data Protection by Design and by Default

In the same way that financial transparency, environmental impacts and board accountability are key measures for listed companies, data protection should be designed into an organisation’s way of doing business, so that it becomes second nature. The EU’s General Data Protection Regulation (GDPR) has increased the prominence and status of Data Protection by Design, Security by Design and Privacy by Design (PbD) practices. The data protection principles of transparency, accountability and data minimisation are crucial. A Data Protection Impact Assessment (DPIA) is a practical tool to practise high-level data governance, demonstrate compliance and add vital data intelligence to an organisation’s knowledge base. Data protection should be operationalised at the beginning of decision-making processes and information life cycles to maximise the planned outcomes. Poor data governance should be considered as problematic as poor workplace health and safety, poorly trained staff and financial mismanagement.

Automated Decisions

Automated decisions are assessments, judgements, results and outcomes made by computers without human intervention. These decisions are often made by computer calculations and the outcomes are not checked or verified by humans. These results can have serious economic, financial, political and social implications for individuals. Companies and organisations may carry out automated decisions without full awareness or assessment of their impact, or without realising that specific data protection rules apply. The outsourcing of Human Resources functions and other business processes has redirected some automated decisions away from organisations’ direct internal management structures, creating greater risks. However, legal responsibilities and liabilities remain with the organisation that acts as the personal data controller. Automated decisions can be based on assumptions about a person’s skills, decisions, actions, intentions or characteristics. Assumptions can be wrong, out of date or incomplete and cause discrimination, financial loss, loss of opportunity, distress or other damage. Companies and organisations should be transparent about assumptions made by automated decisions and apply internal quality checks, testing and outcome verification. Individuals affected should also be provided with a way to intervene in the decision-making processes, request human involvement, express their views or question the outcome.
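
The sketch below is purely illustrative and not drawn from this article: the names, attributes and threshold are assumptions. It shows how a decision made entirely by a computed rule can be paired with a flag that routes adverse outcomes to human review, the safeguard described above.

```python
from dataclasses import dataclass

@dataclass
class LoanApplication:
    # Hypothetical applicant attributes used by the automated rule
    income: float
    existing_debt: float
    requested_amount: float

def automated_decision(app: LoanApplication) -> dict:
    """Illustrative automated decision made without human intervention.

    The 'needs_human_review' flag models the safeguard of letting affected
    individuals obtain human involvement and question the outcome.
    """
    debt_ratio = (app.existing_debt + app.requested_amount) / max(app.income, 1.0)
    approved = debt_ratio < 0.4  # assumed threshold, not a real lending rule
    return {
        "approved": approved,
        "reason": f"debt-to-income ratio {debt_ratio:.2f} against threshold 0.40",
        "needs_human_review": not approved,  # route adverse outcomes to a person
    }

print(automated_decision(LoanApplication(income=30000, existing_debt=5000, requested_amount=10000)))
```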

Algorithms and Strategy

An algorithm is a sequence of defined, computer-implementable instructions used to solve a type of problem or to perform a computation. Algorithms are present wherever computers operate. As a result of the exponential growth of computing power, the enormous increase of data and the rise of artificial intelligence, the role of algorithms has become more prominent in everyday business and in how organisations operate. Companies and organisations should therefore ensure that they have a clear strategy for the use of algorithms that affect individuals. The strategy should sit alongside overall business strategies for growth, efficiency, profits and innovation. All strategic outcomes should be quality-tested against how they protect individuals’ personal data, promote information security (and cybersecurity), encourage data transparency, and create data accountability and data fairness (quality and data minimisation).

Profiling

The rise of information technology, online transactions, social media and internet usage around the world has created an explosion of profiling. Companies and organisations may carry out profiling without full awareness or assessment of its impact, or without realising that specific data protection rules apply to the practice. Profiling is the use of mathematical formulas, computations or algorithms to categorise individuals into one or more classes or groups. Profiling can also be used to evaluate individual characteristics such as performance at work, economic standing, health, personal preferences, interests, reliability (skill or competence), behaviour, location, movement, intention or priorities. The most intrusive elements of profiling can be the ability to infer information from data and the ability to predict an individual’s future choices or actions. Inferences and predictions can be wrong, biased, incomplete or based on irrelevant data, yet have a substantial effect on individuals, including discrimination, financial loss, loss of opportunities, distress or other damage. Companies and organisations must be transparent about their use of profiling, have internal quality checks, and practise data minimisation and verification. Individuals affected must be able to seek information about their profiles and question the decisions made about them.
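
The following sketch is illustrative only; the grouping rule, names and values are assumptions rather than anything taken from this article. It shows how a simple computation can place an individual into a class, and why recording the rule matters if individuals later ask how their profile was produced.

```python
def profile_customer(purchases_per_month: int, average_basket: float) -> str:
    """Illustrative profiling rule that places an individual into one of three groups.

    Only two data points are used (data minimisation); the rule itself should be
    recorded so the organisation can explain the profile if an individual asks.
    """
    if purchases_per_month >= 8 and average_basket >= 50.0:
        return "high-value"
    if purchases_per_month >= 3:
        return "regular"
    return "occasional"

# Assumed example values, for illustration only
print(profile_customer(purchases_per_month=5, average_basket=20.0))  # -> regular
```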

The GDPR has one of the most sophisticated regulatory frameworks for dealing with profiling and automated decision-making. In most cases, automated decision-making is categorised as profiling. EU policy makers anticipated the growth of profiling by ensuring that all foreign companies (with or without an EU presence) that profile EU citizens’ behaviour in the EU fall within the scope of the GDPR, even where the profiling operations take place outside the EU. This may not be well understood and may often be ignored by organisations. As well as compliance with the GDPR’s main principles and provisions, profiling should always be accompanied by Data Protection Impact Assessments (DPIAs). These DPIAs must also comply with the requirements of the relevant EU Member State’s data protection regulator and local laws. Consulting with the individuals affected and with the data protection regulator could also be required, depending on the nature of the profiling. The Data Protection Officer should support and drive the process of producing high quality DPIAs that are well written, honest, easy to understand, effectively reviewed and updated.

Artificial Intelligence

Artificial Intelligence (AI) is the ability of computer systems or computer-controlled robots to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, translation between languages, performing manual tasks, interactions with other computers and interactions with humans. AI is big business and is set to transform the global economy, work, home, education, healthcare and security. The global artificial intelligence market is expected to reach $390.9 billion by 2025, according to a report by Grand View Research, Inc. The market is anticipated to expand at a Compound Annual Growth Rate (CAGR) of 46.2% from 2019 to 2025. Companies and organisations should ensure that, when building AI systems, algorithms are tested, reviewed and their outputs verified. Data sources should be quality checked to remove incomplete data, bias and out of date information. Assumptions and inferences should be robustly tested. These steps are basic data hygiene and reflect similar GDPR requirements. However, GDPR compliance and relevant data protection and privacy laws should be specifically incorporated into AI data life cycles.
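
As a rough, purely illustrative check of the cited growth figures (assuming the 46.2% CAGR compounds over the six years from 2019 to 2025 and ends at $390.9 billion), the implied 2019 starting point can be worked backwards:

```python
# Assumed interpretation of the cited Grand View Research figures:
# a 46.2% CAGR over the six years from 2019 to 2025, ending at $390.9 billion.
cagr = 0.462
years = 6  # 2019 -> 2025
market_2025_bn = 390.9

implied_2019_base_bn = market_2025_bn / (1 + cagr) ** years
print(f"Implied 2019 market size: about ${implied_2019_base_bn:.0f} billion")
```

On these assumptions the 2019 base works out at roughly $40 billion, which gives a sense of the scale of growth the article describes.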

Companies and organisations should ensure that AI is explainable so that individuals affected can increase their understanding and trust can be built. This requirement maps across to the GDPR’s principles of fairness, lawfulness, transparency, purpose limitation, accuracy, integrity, confidentiality and accountability. Frameworks have been published to help organisations manage and explain AI to improve accountability. The European Union High-Level Expert Group on AI has published Ethics Guidelines for Trustworthy Artificial Intelligence. The United States National Institute of Standards and Technology (NIST) has published Four Principles for Explainable Artificial Intelligence. The UK’s data protection regulator, the Information Commissioner’s Office and the Alan Turing Institute, have published joint guidance on Explaining Decisions Made with AI.

Data Protection Lessons

Data protection maturity can improve companies’ and organisations’ key strategic goals of profitability, growth, efficiency, trust, innovation and resilience. Organisations that attempt to grow without robust data protection find that several of their key strategic goals remain uncertain. Their longevity can be at risk because user, customer and supply chain trust is low. Their efficiency and growth are precarious because at any time a data protection regulator, markets regulator, privacy activists, civil society groups, governments or individuals could start campaigns against their poor data protection practices. Fines, bad publicity, internal staff protests, political interjections and whistleblowers can create a state of inherent instability. Excellence in data protection and data protection by design should be positive and proactive advancements rather than reactive responses. For the future, agility and trust will be important economic drivers. Organisations that understand their data and personal data, explain their data uses, embed data protection by design and engage with stakeholders about data governance issues will thrive, remain resilient and fulfil their key strategic objectives.


Schrems II: Rethinking Privacy Shield & Standard Contractual Clauses

Briefing

On 16 July 2020, the European Union’s highest court, the Court of Justice of the European Union (CJEU), delivered the much anticipated decision in the Max Schrems case (Schrems II). The court was asked by Ireland’s High Court to decide on key mechanisms for international transfers of personal data from the EU to the United States. The underlying cases arose out of Austrian privacy activist Max Schrems’ complaint against Facebook and Ireland’s Data Protection Commission over the interpretation of key data protection provisions. Max Schrems objected to US surveillance of foreign nationals, which he argued conflicted with the General Data Protection Regulation (GDPR). The court decided that US surveillance laws and practices stand in opposition to the GDPR’s fundamental human rights protection of EU citizens. As a result, personal data transfers to the US are non-compliant with EU law and need special attention, assessment, reviews and additional safeguards to make them compliant. The case has been described as constitutional in nature and cannot be appealed.

Privacy Shield

The Court of Justice of the European Union found that the EU/US Privacy Shield data protection adequacy decision agreed in 2016 is invalid. Personal data transfers based on this mechanism must cease.  EU citizens have no real judicial remedy or equivalent protections in the US under Privacy Shield. The Swiss/US Privacy Shield remains in force but the Swiss Data Protection Authority is reviewing its position. Privacy Shield continues to operate internally in the USA based on federal enforcement mechanisms, US laws and the role of domestic regulators.

Standard Contractual Clauses (SCCs)

The European Commission’s Data Protection Standard Contractual Clauses remain lawful and enforceable. However, the court has insisted that Data Exporters (in the EU) and Data Importers (in foreign countries) must carry out more detailed checks to ensure that foreign laws and data governance rules are compatible with GDPR. Data Importers must inform Data Exporters if they are unable to comply with EU data protection law. Data Exporters must refuse to transfer personal data where specific personal data transfers are incompatible. EU Data Protection Authorities are also encouraged to intervene and review Standard Contractual Clauses and be prepared to withhold or withdraw authorisations for international personal data transfers. On 4 June 2021, the European Commission published its final updated Standard Contractual Clauses that comply with GDPR and the Schrems 2 case. On 21 March 2022, the UK published its new international data transfer regime.

Responses and Actions

  1. Companies and organisations should assess their exposure to Privacy Shield, work towards stopping these personal data transfers and investigate substitute arrangements. There is no grace period for compliance.
  2. Wait for and act on concrete guidance from each relevant EU Member State’s Data Protection Authority, the European Data Protection Board (EDPB) and the European Commission.
  3. Adopt the European Commission’s new GDPR-approved Standard Contractual Clauses (published June 2021) and implement these by December 2022.
  4. Begin to review high value and high risk contracts that contain Standard Contractual Clauses (SCCs) that allow transfers to the USA.
  5. Review Binding Corporate Rules (BCRs) to see if personal data transfer protections from the EU to the USA need to be strengthened or varied.

Resources

EU / US and Swiss / US Privacy Shield Home Page

Schrems II Case Press Release

Schrems II Case Full Judgment

Schrems II European Data Protection Board (EDPB) Frequently Asked Questions

Schrems II US Federal Trade Commission (FTC) Statement

Schrems II US Secretary of Commerce Statement

Schrems II Joint Statement from European Commission and US Department of Commerce

Schrems II Ireland Data Protection Commission (DPC) First Statement

Schrems II UK Data Protection Commissioner’s Office (ICO) First Statement and Updated Statement

Schrems II European Data Protection Board (EDPB) Taskforce on Post-Schrems II Complaints

Schrems II US Department of Commerce, US Justice Department & US Office of the Director of National Intelligence White Paper on US Privacy Safeguards for SCCs and other Legal Bases

Schrems II European Data Protection Supervisor (EDPS) Strategy for EU Institutions to comply with Schrems 2 Ruling

Schrems II European Data Protection Board (EDPB) Supplementary Measures for data transfer tools to ensure GDPR compliance – Consultation

Schrems II European Commission Standard Contractual Clauses (SCCs) 2020 – Consultation  

Schrems II European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) Joint Opinion 2/2021 on Standard Contractual Clauses for the Transfer of Personal Data to Third Countries

European Commission Final Standard Contractual Clauses (SCCs) for Data Controllers and Data Processors and also International Data Transfers – June 2021

UK Information Commissioner’s Office (ICO) Consultation on UK International Data Transfers and UK Standard Contractual Clauses – August 2021

UK Information Commissioner’s Office (ICO) Response to DCMS Consultation “Data: A New Direction” – October 2021

UK GDPR Final International Personal Data Transfers Scheme and Documents – March 2022

European Commission announcement of an EU/US Trans-Atlantic Data Privacy Framework Agreement in Principle – March 2022

White House Briefing Room announcement of an EU/US Trans-Atlantic Data Privacy Framework Agreement in Principle and FactSheet – March 2022

European Commission Questions and Answers (Q&As) for the two sets of EU 2021 Data Protection Standard Contractual Clauses – May 2022

For Further Assistance, contact PrivacySolved:

Telephone (London): +44 207 175 9771

Telephone (Dublin): +353 1 960 9370

Email: contact@privacysolved.com

GDPR in 2020 and in the Future: Views from Brussels

Briefing

The European Commission’s General Data Protection Regulation (GDPR) Evaluation Report of June 2020 declares the GDPR a success. However, it concedes that there is still more work to do. The EU is proud that the law is now a reference point and a catalyst for many countries around the world to modernise their data protection rules. Businesses, including SMEs, can comply with unified rules on a more level playing field. The general level of GDPR awareness among European citizens stands at between 69% and 71%. Conversely, 30% of EU citizens are not sufficiently engaged with data protection. This is a concern in an increasingly data-driven and artificial intelligence-led future. The EU boasts that the GDPR is future-proof and provides important and flexible tools to ensure data protection / privacy by design and security by design as new technologies develop.

The Challenges

Since May 2018, there have been challenges to the uniform application of GDPR at EU level and in each EU country:

  • Between May 2018 and November 2019, 22 EU/EEA GDPR regulators issued 785 fines. However, most fines have been relatively modest and were mainly issued against the public sector and small companies.
  • The handling of cross-border cases has not been as efficient or cohesive as intended. Differences persist in national administrative and court procedures, in interpretations of key GDPR concepts, and in how and when to activate cooperation procedures.
  • Slovenia has not yet enacted new GDPR laws or updated older data protection laws and so is a weak link in EU-wide compliance.
  • Ireland and Luxembourg, which host large global company headquarters, have not received sufficient national funding and resources to meet their significant GDPR regulatory responsibilities.
  • The EU’s GDPR regulators, acting as the European Data Protection Board (EDPB), mutually assist each other, but the consistency mechanism’s key dispute resolution and urgency procedures have not yet been used.

Priorities and Actions

EU institutions, GDPR regulators and national governments have been tasked with the following actions:

  • National governments should ensure that national laws and sector rules are fully in line with the GDPR.
  • National governments should provide GDPR regulators with the necessary human, financial and technical resources to properly enforce the data protection rules and liaise with stakeholders, citizens and SMEs.
  • GDPR regulators should develop efficient working arrangements and increase the functioning of the cooperation and consistency mechanisms.
  • GDPR regulators should closely monitor how the GDPR applies to new technologies and areas such as Artificial Intelligence, the Internet of Things, Blockchain and scientific research, and the EDPB will issue guidance on these topics.
  • The European Commission should continue to promote the convergence of data protection rules to ensure safe international data flows. This could include new or updated data protection laws or adopting the Data Free Flow with Trust (DFFT) concept internationally.
  • The European Commission should continue data protection adequacy discussions with non EU/EEA third-countries.
  • The European Commission will modernise and expand international data transfer mechanisms by updating the EU’s data protection Standard Contractual Clauses (SCCs) and certification mechanisms.
  • The EDPB will clarify the procedural steps to improve cooperation between the lead data protection authority and the other GDPR regulators involved in shared activities.
  • The EDPB will streamline the assessment and approval processes for Binding Corporate Rules (BCRs) to speed up the process.
  • The EDPB will complete work on the architecture, procedures and assessment criteria for codes of conduct and certification mechanisms as tools for international data transfers.

The Future

The EU believes that the GDPR’s future-proof and technology-neutral approach was tested by the Coronavirus Covid-19 pandemic and has proven to be successful. GDPR principles provided a useful framework to support the development of tools to combat and monitor the spread of the virus. This future-proof and risk-based approach will apply to the EU’s framework for Artificial Intelligence and the European Data Strategy. The overall aim is that GDPR becomes fully incorporated into the EU’s digital policy, data governance, data ethics, digital transformation, cybersecurity and pandemic recovery plans and initiatives. The EU’s strategy is also international, including engagement with African and Asian partners and inter-governmental bodies to promote regulatory convergence and support capacity-building within data protection regulators globally. There is also a plan to promote greater international enforcement cooperation between data privacy regulators, including signing cooperation and mutual assistance agreements.

Navigating Brexit Data Protection Uncertainty, Risks, and Options

Article

The UK’s departure from the EU on 31 January 2020 (‘Brexit’) changes the EU/UK data governance landscape. The agreed transition period [1] until 31 December 2020 offers a period of EU/UK data protection continuity [2] and ‘business as usual.’ In the longer term, however, there is uncertainty about EU to UK personal data flows, UK data protection law, and General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) compliance. EU-based, European Economic Area (‘EEA’) based, and international businesses face a series of challenges when seeking to understand and fully predict the UK’s data protection future. Wayne Cleghorn, CEO of PrivacySolved, explores these uncertainties, risks, and options to shed light and offer guidance on priorities and actions.

Mind the gap: UK data protection and EU GDPR future

EU, EEA, and international businesses and organisations understand that EU data protection laws lie at the heart of EU politics, human rights, economy, and trade. The GDPR seeks to place data protection at the heart of the EU’s single market and the future digital single market, while also further elevating the protection of personal data and special categories of data as a fundamental EU right and a broader human right [4]. The UK’s EU Withdrawal Agreement Act [5] removes the UK from this system by ending the application of key EU treaties to the UK [6]. However, the UK enacted the Data Protection Act 2018 (‘the Act’) [7] to anchor the GDPR into UK domestic law. This Act will replace the GDPR after the end of the transition period and offers most of the protections of the GDPR, but without the key functional mechanisms that other EU Member States will rely on. These mechanisms include the role of the European Commission in data protection, European Data Protection Board (‘EDPB’) membership [8], the consistency mechanism [9], the One Stop Shop mechanism [10], the EU-US Privacy Shield (‘the Privacy Shield’) [11], and the data protection decisions of the Court of Justice of the European Union (‘CJEU’) [12]. Legally and practically, UK data protection divergence begins on 1 February 2020, even within the short transition period. At the end of the transition period, UK data protection risks becoming less aligned with the EU and less automatic. The UK and EU will be on different paths as a result of the post-Brexit status and inertia. This ‘new normal’ creates pockets of uncertainty, risks, opportunities, and options.

Uncertainties and risks

UK adequacy decision

UK, EU, EEA, and international businesses’ personal data flows are best protected and suffer the least disruption if the European Commission issues a post-Brexit ‘adequacy decision’ [13] that the UK provides an adequate level of data protection comparable to the EU. The UK has a good claim to such an adequacy decision because of its existing GDPR alignment [14], but the adequacy process includes wide-ranging investigations and a formal decision of the European Commission in consultation with other EU bodies [15]. As a result, a decision is unlikely to be made for many months and it may become entangled in the UK/EU free trade agreement negotiations occurring throughout 2020 and beyond.

International data transfers

On exiting the EU and the EEA, after the transition period, without an adequacy decision, the UK becomes a ‘third country’ in terms of data protection [16]. EU and EEA businesses and organisations, as well as international businesses with EU/EEA operations, need to review and plan in advance for the appropriate safeguards needed to facilitate EU to UK personal data transfers. Standard Contractual Clauses (‘SCCs’) [17] are the most common solution, but the data exporter must be in the EU and the data importer outside the EU, so these will not typically facilitate data transfers from the UK to the EU after the transition period. The existing Privacy Shield [18] will no longer cover the UK for UK to US data transfers, and so existing arrangements will need to be adjusted in advance and while a UK version of the Privacy Shield is created. Binding Corporate Rules (‘BCRs’) [19] are a stable solution, but they cover only intra-group data transfers and take a long time to prepare and receive approval from EU data protection supervisory authorities. The agreed transition period appears to be too short to begin any substantial BCR applications at the UK Information Commissioner’s Office (‘ICO’). After transition, the ICO will no longer be a GDPR BCR-granting data protection supervisory authority, and so EU and international businesses and organisations need to examine their legal proximity and access to other EU data protection supervisory authorities for their BCR compliance activities. One key post-Brexit transition period challenge will be how EU-based data processors and sub-processors respond to data protection compliance instructions from UK-based data controllers. This scenario [20] was never envisaged by the authors of the GDPR. As a result, this situation creates many complications and must be dealt with on a case-by-case basis. Bespoke contracting will be one of the ways to create solutions for these gaps.

The ICO and UK courts

At the time of publication, the ICO [21] is one of the largest, most active, and influential data protection authorities in the EU and around the world. During the Brexit transition period, it will continue its GDPR supervisory authority role [22], but at a distance and with the disadvantage of no longer being an active decision-making member [23] of the EDPB. The ICO’s longer term position in the EU’s structures remains even more uncertain after the Brexit transition period. While the ICO will continue to safeguard UK residents and be the data protection authority for many UK-based businesses, it is unclear whether the ICO will accept and handle GDPR complaints from EU citizens, EU-based, and international data controllers and processors under the GDPR [24]. Several of the ICO’s key powers come from the GDPR, which has made it an integral member of the EDPB [25]. However, the ICO has accepted that, in law, it will no longer be a ‘supervisory authority’ for the GDPR after the end of the transition period [26], but it will seek to maintain a close relationship with the EDPB. Going forward, the most impactful issue is the likelihood that the ICO will begin to apply data protection legal interpretation primarily from UK courts and not the CJEU or other EU Member States. If this occurs, UK data protection divergence will become entrenched. UK courts have only recently begun to produce high-level court decisions on data protection remedies [27]. Post-Brexit, these courts may retreat to narrower and more UK-centric data protection interpretations and applications.

Options and actions for EU-based, EEA-based, and international businesses and organisations

In the short to medium term, the UK data protection landscape should be regarded as a work in progress, a special case, and a candidate country for an EU adequacy decision. Businesses and organisations should seek continuity where possible, reduce the risks of personal data flow interruption, and preserve UK/EU GDPR alignment as much as possible, especially within the Brexit transition period which runs to December 2020 [28]. However, this implementation period is short and there are several matters that require specific early attention, review, and action by data controllers and data processors outside the UK.

Plan to update data protection notices, data protection policies, contract clauses about the GDPR, and initiate supply chain reviews

Key documents that have not already been reviewed will need to be updated to ensure that the impact of the UK’s Brexit on data protection compliance is acknowledged in commercial arrangements. New arrangements may need to be negotiated, agreed and formally updated.

Plan to replace the UK ICO as the GDPR lead supervisory authority, One Stop Shop authority, and BCR approval authority

EU and international businesses and organisations should review their previous analysis of the UK ICO as their lead supervisory authority for the GDPR, their One Stop Shop authority, and the authority to which their BCRs can be submitted and agreed. Alternative EU supervisory authorities should be considered and selected to replace the ICO’s existing role for these activities to properly comply with the GDPR over the longer term. Detailed expert advice may be required to embed these changes. For larger organisations, the transition period could be used to consider and begin to implement any changes.

Appoint an EU representative

During and after Brexit’s transition period, the GDPR will still apply to businesses or organisations that offer goods or services to, or monitor the behaviour of, EU citizens. Where these businesses and organisations have no establishment, settled presence or stable arrangements in an EU Member State, the business or organisation must appoint an EU representative [29] to liaise with the relevant EU supervisory authorities and deal with individuals who wish to exercise their rights under the GDPR. The UK will no longer be an eligible EU Member State after the transition period. As a result, UK businesses and international businesses and organisations that have GDPR obligations will need to re-direct their GDPR compliance focus to other EU countries. International businesses should also reassess UK-based EU representatives which are currently in place. Care should be taken to negotiate and agree the scope of these appointments. The identities of the relevant instructing data controllers and data processors should be clear. Liability, insurance, and the roles and responsibilities of each party should also be explicitly agreed. It will take time to update internal and external teams, processes, technologies, and training, and so larger and more complex businesses should not wait until the end of the transition period to begin this work.

Focus on international data transfers

International data transfers can be a risky area of GDPR compliance and are subject to change. The CJEU is likely to issue court decisions on SCCs and EU institutions will provide updates on the Privacy Shield and BCRs. Currently approved EU SCCs may be updated to better reflect the GDPR. When these updates occur, the UK’s position will become apparent, especially if EU institutions and courts require changes to be made, which the UK may not be legally obliged to follow. A key test is due in May 2020, when the European Commission will present its first evaluation and review [30] of the GDPR to the European Parliament and the Council of the European Union.

Focus on data protection developments in key sectors and the growth of the GDPR codes of practice and certifications

Codes of practice and certification mechanisms are being developed in the EU and UK, and may provide GDPR compliance solutions and options in the medium to longer term. These may, over time, help to bridge the increasing EU/UK data protection divide and reduce the data protection uncertainties created by Brexit.

For Enquiries:

contact@privacysolved.com

London: +44 207 175 9771 \ Dublin: +353 1 960 9370

www.privacysolved.com

References:

1. Articles 126-127 of the EU / UK Consolidated Withdrawal Agreement of 17 October 2019, TF50 (2019) – Commission to EU 27, available at: https://ec.europa.eu/commission/sites/beta-political/files/consolidated_withdrawal_agreement_17-10-2019_1.pdf

2. Article 128 of the EU/UK Consolidated Withdrawal Agreement of 17 October 2019, TF50 (2019) – Commission to EU 27, available at: https://ec.europa.eu/commission/sites/beta-political/files/consolidated_withdrawal_agreement_17-10-2019_1.pdf

3. GDPR, available at https://eur-lex.europa.eu/eli/reg/2016/679/oj

4. GDPR, Recitals 1-8.

5. EU (Withdrawal Agreement) Act 2020, available at: http://www.legislation.gov.uk/ukpga/2020/1/contents/enacted

6. Section 1 of the European Union (Withdrawal) Act 2018, available at: http://www.legislation.gov.uk/ukpga/2018/16/contents/enacted

7. UK Data Protection Act 2018, available at: http://www.legislation.gov.uk/ukpga/2018/12/contents/enacted

8. Articles 68-76 and Recitals 139 – 140, GDPR.

9. Articles 63-67 and Recitals 136 – 138, GDPR.

10. Article 56 and Recital 127, GDPR.

11. Available at: https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/eu-us-data-transfers_en#commercial-sector-eu-us-privacy-shield  and https://www.privacyshield.gov/welcome

12. Available at: https://curia.europa.eu/jcms/jcms/j_6/en/

13. Article 45, GDPR, see also: https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/adequacy-decisions_en

14. See: https://publications.parliament.uk/pa/cm201719/cmselect/cmexeu/1317/131702.htm

15. See: https://www.europarl.europa.eu/RegData/etudes/STUD/2018/604976/IPOL_STU(2018)604976_EN.pdf

16. See Speech by EU Chief Negotiator Michel Barnier on 26 May 2018 in Lisbon “..And we cannot, and will not, share this decision-making autonomy with a third country, including a former Member State who does not want to be part of the same legal ecosystem as us” available at: https://ec.europa.eu/commission/presscorner/detail/en/SPEECH_18_3962

17. Article 46, GDPR, see also: https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/standard-contractual-clauses-scc_en

18. Article 45, GDPR, see also: https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/eu-us-data-transfers_en

19. Article 47, GDPR, see also: https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/binding-corporate-rules-bcr_en

20. EDPB Guidelines 3/2018 on the territorial scope of the GDPR, available at: https://edpb.europa.eu/our-work-tools/public-consultations/2018/guidelines-32018-territorial-scope-gdpr-article-3_en

21. See: https://ico.org.uk

22. See “Statement on data protection and Brexit implementation – what you need to do” on 29 January 2020 and updated “Brexit FAQ”, available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/01/statement-on-data-protection-and-brexit-implementation-what-you-need-to-do/

23. Article 128 (5) of the EU/UK Consolidated Withdrawal Agreement of 17 October 2019, TF50 (2019) – Commission to EU 27, available at: https://ec.europa.eu/commission/sites/beta-political/files/consolidated_withdrawal_agreement_17-10-2019_1.pdf

24. Article 57, GDPR.

25. Articles 51-59 and Recitals 117-129, GDPR.

26. See: https://ico.org.uk/for-organisations/data-protection-and-brexit/data-protection-if-there-s-no-brexit-deal-3/the-gdpr/ico-and-the-edpb/

27. See Vidal-Hall v Google Inc [2015] EWCA Civ 311 [2016] QB 1003 see: https://www.judiciary.uk/wp-content/uploads/2015/03/google-v-vidal-hall-judgment.pdf and Lloyd v Google [2018] EWHC 2599, see: https://www.judiciary.uk/wp-content/uploads/2018/10/lloyd-v-google-judgment.pdf

28. Section 33 of EU (Withdrawal Agreement) Act 2020.

29. GDPR, Article 27.

30. GDPR, Article 97.

Also published by DataGuidance