Cybersecurity: Key Data Security Sources for Surviving Covid-19

Briefing  

The coronavirus pandemic has created an explosion in information security awareness and a sense of hypervigilance. Cyberattacks have increased, especially malware, phishing, vishing and ransomware. As cyber awareness increases, boards, leadership teams and individuals need access to the most reliable sources of information and advice. Excellence, expertise and the ability to communicate security threats, risks, priorities, trends and effective responses are crucial. These trusted insights are vital for companies and organisations.

Leading Data Security Sources: Centres of Excellence

The organisations below have consistently helped companies, organisations and individuals to identify threats, improve controls, increase training and reduce the risk of cybersecurity breaches and loss of reputation. Covid-19 has reinforced their importance. They understand the national and international security landscape and their experience spans many sectors. Several of them play a key role in national cybersecurity strategies and so are trusted by governments and public services. These organisations raise awareness, issue threat alerts, produce guidance, publish analysis, create training materials, lead certification activities, respond to data breaches, secure critical national infrastructure and work with companies and organisations to improve their cyber resilience.

UK National Cyber Security Centre (NCSC)

The NCSC was created in 2016 and spun out of the UK’s GCHQ. It combines CESG (GCHQ’s information security arm), the Centre for Cyber Assessment (CCA), the Computer Emergency Response Team UK (CERT UK) and the cyber-related work of the Centre for the Protection of National Infrastructure (CPNI). It has responsibilities across government, for critical national infrastructure protection and for the national cyber security strategy. Its guidance, standards-setting, alerts, website, social media output and work with all sectors make it a leader in information security.

National Institute of Standards and Technology (NIST)

NIST is a non-regulatory agency of the United States Department of Commerce with a central role in promoting innovation and industrial competitiveness. Its main laboratory programmes include nanoscale science and technology, engineering, information technology, neutron research, material measurement, and physical measurement. For cybersecurity and data privacy, its standards and frameworks are widely adopted and underpin the information systems of organisations around the world. This work is supported by the Computer Security Resource Center (CSRC). Its guidance, standards, measurements, publications, website and social media output are authoritative.

The European Union Agency for Cybersecurity (ENISA)

ENISA is an agency of the European Union, created in 2005 and located in Athens and Heraklion in Greece. The agency works with EU Member States to advise, offer solutions and improve cybersecurity capabilities. It builds capacity to respond to large cross-border cybersecurity incidents or crises. It has developed cybersecurity certification schemes since 2015. ENISA acts as a key centre of expertise for Member States, EU institutions and private organisations on network and information security. Its guidance, CERT co-ordination, standards, certification schemes, publications, website and social media output are highly influential.

United States Computer Emergency Readiness Team (US-CERT)

US-CERT analyses and reduces cyber threats and vulnerabilities, disseminates cyber threat warnings and coordinates incident response activities. It uses advanced network and digital media analysis to identify malicious activity targeting networks in the United States and abroad. US-CERT is part of the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA). Its work includes threat analysis and information sharing, digital analytics, operations, communications and international work. Its publications, advisories, alerts, analysis, advice, website and social media output are respected. Its unique selling point is analysing and disseminating information about the most persistent international cybersecurity threats.

Federal Bureau of Investigation (FBI) – Cyber Division

Created in 2002, the FBI’s Cyber Division leads the US national effort to investigate and prosecute internet crimes, cyber-based terrorism, espionage, computer intrusions and major cyber fraud. It proactively informs the public about current trends in cybercrime. Its three key priorities are computer intrusion, identity theft and cyber fraud. It works with other agencies and takes part in cross-border initiatives.

Other Influential Data Security Organisations include:

Australian Cyber Security Centre

Canadian Centre for Cyber Security

National Cyber Security Centre (Ireland)

National Cyber Security Centre (Netherlands)

National Cyber Security Centre (New Zealand)

The National Cybersecurity Agency of France

Cyber Security Agency of Singapore

PS102020

Cybersecurity and Cyber Resilience in the FinTech Sector

Article

The FinTech sector was valued at €140 billion globally in 2018 and is estimated to more than triple in size to €431 billion by 2022. In the EU, FinTech investments increased by nearly 300% in 2018 from the previous year, to €37 billion. The FinTech sector’s aims of transforming financial services delivery and offering innovative data-rich services make it highly attractive for venture capital. As the sector expands, the risks of hacking, cybercrime, cybersecurity incidents, and personal data breaches increase. FinTech faces unique cybersecurity challenges, but with the application of standards, tools, and strategies the sector can remain proactive and cyber resilient.

FinTech’s Unique Cybersecurity Landscape

The FinTech sector is a series of related financial technologies. The sector is, by nature, innovative and data-driven, with ever expanding boundaries. The ecosystem includes large traditional banks, financial services providers, challenger banks, and a wide range of start-ups. Key FinTech services include payments, alternative finance, smartphone-based mobile retail banking, currency exchange services, investing services, and cryptocurrencies. The edges of FinTech stretch into ‘InsurTech’ and the more multifaceted ‘RegTech’ sector. FinTech’s growth, innovative use of data, and user-focus make it a unique target for cybercrime and cybersecurity threats.

FinTech actively uses new technologies, data analytics, Big Data, artificial intelligence, robotic process automation (RPA), blockchain, and biometrics. The sector is an evolving mix of diverse data points and a large footprint of endpoints and devices. It is home to varied data sets, including financial transaction, payment card, credit report and geolocation data, as well as special categories of personal data and other sensitive data. As a result, it is an increasing target for cybercriminals, cybersecurity incidents, and personal data breaches. Distributed denial-of-service attacks are increasingly common. Ransomware, malware, and phishing attacks are also growing.

A Mix of Rules and Regulations

In the EU, FinTech as a combined sector is not highly regulated. However, depending on the type of FinTech organisation, the technologies deployed, or the types of data used, various laws and rules will impose data security norms. Traditional banks, challenger banks, and smartphone-based financial services providers face the most demanding cybersecurity rules. The EU’s Payment Services Directive (EU 2015/2366) (‘PSD2’) led the way for open banking by allowing banks to make their customers’ personal or business current-account information accessible to external third-party providers. PSD2 supercharged the growth of EU FinTech. FinTechs are also governed by a mixture of EU banking authorities, EU financial services laws, central banks, and national financial services regulators. Organisations that are part of critical national infrastructure fall within the Directive on Security of Network and Information Systems (Directive (EU) 2016/1148) (‘the NIS Directive’). Their supply chains, which can include FinTechs, are indirectly regulated by these cybersecurity standards. FinTechs that use direct marketing tools, cookies, and similar technologies must comply with the Directive on Privacy and Electronic Communications (Directive 2002/58/EC) (‘the ePrivacy Directive’) and the related national laws in each EU country.

The General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) provides overarching rules to encourage cybersecurity and data protection compliance. The GDPR’s rules on transparency, accountability, security of data processing, personal data breach notifications to regulators and individuals, Privacy by Design, Privacy by Default, Data Protection Impact Assessments (‘DPIAs’), and the appointment of data protection officers, offer FinTechs a baseline for compliance, which they must build on to reflect their specific context and risk-profile.

EU public policy has acknowledged the need to make cybersecurity the number one priority in FinTech planning. The European Commission adopted the EU FinTech Action Plan (‘the Action Plan’) in 2018 with the clear aim of placing cybersecurity and integrity at the heart of FinTech growth and development. The Action Plan encourages a security by design approach. The European Banking Authority also published a FinTech Roadmap setting out its priorities for 2018/2019. The European Union Agency for Cybersecurity (‘ENISA’) is, at the time of publication, working on an EU certification framework for ICT security products and services, increasing access to threat intelligence and information sharing, encouraging penetration and resilience testing, and increasing cybersecurity training and awareness. In 2019, the European Supervisory Authorities published advice to the European Commission on strengthening EU cyber and IT security regulation in the financial sector. A key recommendation was to develop an EU oversight framework for third-party providers active in financial services, especially cloud service providers. Another was to develop an EU-wide framework for testing the cyber resilience of important financial institutions. Globally, at an intergovernmental level, the G7, the G20, the Organisation for Economic Co-operation and Development, the International Monetary Fund, and the World Bank are also working on FinTech cybersecurity and information security for financial services.

FinTech Cybersecurity and Cyber Resilience Standards and Tools

Security by design (and security engineering) should underpin FinTech infrastructure, services, software, and applications, so that security is built in by default, providing a secure environment at the core and at the endpoints.

International information security standards, such as ISO 27001, allow FinTechs to create and manage high quality information systems. However, newer standards, such as ISO 27032:2012 for improving the state of cybersecurity and ISO 27701:2019 for extending information security management to privacy information management, can be used to mature the level of compliance. FinTechs should also apply the Payment Card Industry Data Security Standard, where applicable, the National Institute of Standards and Technology (‘NIST’) Cybersecurity Framework, financial services IT standards, and other sector norms in the countries in which the FinTech operates.

A zero-trust approach and continuous testing allow FinTechs to significantly fortify their networks, endpoints, and level of resilience. Zero-trust architecture and zero-trust networks are based on the principle that actors, systems, or services operating from within the security perimeter should not be automatically trusted, but must be verified both to initiate and to continue access to IT services.
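To make the principle concrete, the minimal Python sketch below shows the kind of per-request check a zero-trust design implies: identity, device posture and entitlement are verified on every request, and access lapses unless re-verified. The field names, the one-hour re-verification window and the entitlement table are illustrative assumptions, not a description of any particular FinTech’s controls or product API.

```python
# A minimal zero-trust access check (illustrative sketch only; the field names,
# one-hour re-verification window and entitlement table are assumptions).
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class AccessRequest:
    user_id: str
    device_compliant: bool               # e.g. managed, patched, encrypted device
    mfa_verified_at: Optional[datetime]  # when the user last passed MFA
    requested_service: str


def is_access_allowed(request: AccessRequest, now: datetime) -> bool:
    """Verify every request explicitly; never trust network location alone."""
    # 1. Identity must have been proven recently via multi-factor authentication.
    if request.mfa_verified_at is None:
        return False
    if now - request.mfa_verified_at > timedelta(hours=1):
        return False  # verification is also required to *continue* access

    # 2. Device posture must meet policy, even inside the network perimeter.
    if not request.device_compliant:
        return False

    # 3. Least privilege: only the services this user is entitled to use.
    entitlements = {"alice": {"payments-api"}, "bob": {"reporting"}}
    return request.requested_service in entitlements.get(request.user_id, set())


if __name__ == "__main__":
    request = AccessRequest(
        user_id="alice",
        device_compliant=True,
        mfa_verified_at=datetime.utcnow() - timedelta(minutes=10),
        requested_service="payments-api",
    )
    print(is_access_allowed(request, datetime.utcnow()))  # True
```

In practice the same checks would be enforced by an identity provider and policy engine rather than application code, but the design choice is the same: every request is evaluated on identity, device and entitlement, never on network location.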

DPIAs allow FinTechs to better understand their personal data use and demonstrate GDPR compliance. DPIAs focus on high-risk data processing and enable risk identification, remediation, risk acceptance, risk reduction, and risk management. At the system design stage, DPIAs can help FinTechs to identify and adopt Privacy by Design.

Supply chain cybersecurity compliance, strength, and resilience are vital for business continuity and disaster recovery. FinTechs should build in IT flexibility and backup options, especially for cloud services. Supply chain partners must be held to high standards of cybersecurity compliance. They should also display cybersecurity agility and responsiveness to react to threats, risks, near-misses, and breaches.

Proactive Cyber Resilience

The language of cybersecurity can often appear binary and prosaic to developers, FinTech founders, senior leaders, and boards. Cybersecurity is often presented as a problem to be fixed so that growth and profits can continue uninterrupted. In truth, cybersecurity is fluid, an enabler, and an adept partner to FinTech’s most ingenious innovations. In today’s complex global supply chains, facing an aggressive and evolving threat landscape, cybersecurity must be aligned with proactive cyber resilience.

NIST defines cyber resilience as ‘the ability to prepare for and adapt to changing conditions and withstand and recover rapidly from disruptions. Resilience includes the ability to withstand and recover from deliberate attacks, accidents, or naturally occurring threats or incidents.’ Proactive cyber resilience is a more suitable and beneficial aim, allowing organisations to apply cybersecurity more broadly to include disaster recovery, business continuity, intelligent cyber insurance, and supply chain strength and flexibility. FinTech’s dynamism, complexity, and expanding boundaries require security engineering and cybersecurity to be core competences within the sector’s ecosystem, where the watchword is always resilience.

For Enquiries:

contact@privacysolved.com

London: +44 207 175 9771 | Dublin: +353 1 960 9370

www.privacysolved.com

Also published by DataGuidance

Five Key Things to Know about Dubai DIFC Data Protection Law 2020

The Dubai International Financial Centre (DIFC) Data Protection Law 2020 (DP Law) applies to the DIFC financial services free zone in Dubai, United Arab Emirates and took effect on 1 July 2020. The DIFC DP Law protects the personal data held and processed by organisations that are registered in the DIFC as well as linked external organisations. New data protection rights include the right to access personal data, the right to data portability, the right to withdraw consent, the right to object to automated decisions (including profiling) and the right not to suffer discrimination for exercising data protection rights. Businesses have an overriding duty to demonstrate compliance with the data protection principles. The DIFC Commissioner of Data Protection is the regulator. Regulator enforcement starts on 1 October 2020.

1. What types of organisations are covered by DIFC DP Law?

The law applies to businesses that are registered in the DIFC or businesses that process personal data in the DIFC as part of stable arrangements. Businesses that process data on behalf of these organisations, such as their suppliers, are also covered by the law.

2. What types of data or information are covered by DIFC DP Law?

The DIFC DP Law protects personal data, which is defined as information that identifies or makes living individuals identifiable. An individual is identified or identifiable by reference to an identifier such as a name, an identification number, location data, an online identifier, or one or more factors specific to that individual’s biological, physical, biometric, physiological, mental, genetic, economic, cultural or social identity.

3. What are the main DIFC DP Law obligations for businesses?

Businesses must:

  1. Comply with the additional data protection principles of accountability (demonstrating compliance) and transparency, and process personal data in line with the rights of individuals.
  2. Appoint a Data Protection Officer (DPO), if they are DIFC bodies or carry out high risk processing on a systematic or regular basis. Other controllers or processors may appoint DPOs.
  3. Report data breaches as soon as practicable in the circumstances to the DIFC Commissioner of Data Protection and to the individuals affected (if the breach poses a high risk to security or individual rights).
  4. Register with the regulator and publish detailed data protection notices.
  5. Complete Data Protection Impact Assessments (DPIAs) for high risk data processing.

4. If businesses comply with the European Union’s General Data Protection Regulation (GDPR), will they automatically comply with DIFC DP Law?

Yes, in large part, but not completely. The GDPR and the DIFC DP Law have different scopes, definitions, special provisions and compliance requirements. However, there are important similarities. The DIFC DP Law was enacted to include provisions that largely mirror the GDPR. It is likely that the DIFC will make an application to the European Union (EU) for an adequacy decision to ease international data transfers between the DIFC and the EU. GDPR data mapping and records of processing activities can help to identify personal data affected by the DIFC DP Law. GDPR privacy notices, policies and the processes used to respond to GDPR rights can assist DIFC DP Law compliance, but these must be tailored. Data processing agreements and online notices must be specifically updated.

5. Does the DIFC DP Law apply to foreign based companies and what are the penalties for breach of the law?

Yes, it can. The DIFC DP Law applies to foreign businesses that are registered in the DIFC or that process personal data in the DIFC as part of stable arrangements. It also applies to businesses that process data on behalf of organisations registered in the DIFC, or on behalf of organisations that process data in the DIFC as part of stable arrangements. The DIFC Commissioner of Data Protection can impose administrative fines of up to $100,000. DIFC Courts can order businesses to pay compensation to individuals.

Automated Decisions, Algorithms, Profiling and AI: EU Data Protection Lessons

Article

Companies and organisations should ensure that their data protection compliance is not reduced to a set of policies and procedures, quarterly reports and annual reviews. Data protection outcomes should not be synonymous with the introduction of enterprise privacy software, compliance team updates of controls, or data privacy treated as an intractable legal and IT add-on to be overcome. Effective data protection should be dynamic and integral to day-to-day activities, in the way that workplace health and safety, financial probity and corporate good conduct flow through organisations, affecting almost every decision. Data protection should not play catch-up to digital transformation initiatives, IT strategy changes, research and development priorities or expansions of the supply chain. Data protection principles should be applied consciously to strengthen an organisation’s core DNA and operating model. As a result, whenever personal data are collected, stored or used, data protection should become a byword for responsible data management, excellent data ethics, protecting individuals’ personal data, accountability, security, resilience, profitability, trust and innovation.

Data Protection by Design and by Default

In the same way that financial transparency, environmental impacts and board accountability are key measures for listed companies, data protection should be designed into an organisation’s way of doing business, so that it becomes second nature. The EU’s General Data Protection Regulation (GDPR) has increased the prominence and status of Data Protection by Design, Security by Design and Privacy by Design (PbD) practices. The data protection principles of transparency, accountability and data minimisation are crucial. The Data Protection Impact Assessment (DPIA) is a practical tool to practise high-level data governance, demonstrate compliance and add vital data intelligence to an organisation’s knowledge base. Data protection should be operationalised at the beginning of decision-making processes and information life cycles to maximise the planned outcomes. Poor data governance should be considered as problematic as poor workplace health and safety, poorly trained staff and financial mismanagement.

Automated Decisions

Automated decisions are assessments, judgements, results and outcomes made by computers without human intervention. These decisions are often made by computer calculations and the outcomes are not checked or verified by humans. The results can have serious economic, financial, political and social implications for individuals. Companies and organisations may carry out automated decision-making without full awareness or assessment of its impact, or without realising that specific data protection rules apply. The outsourcing of Human Resources functions and other business processes has redirected some automated decisions away from organisations’ direct internal management structures, creating greater risks. However, legal responsibilities and liabilities remain with the organisation that acts as the personal data controller. Automated decisions can be based on assumptions about a person’s skills, decisions, actions, intentions or characteristics. Assumptions can be wrong, out of date or incomplete and can cause discrimination, financial loss, loss of opportunity, distress or other damage. Companies and organisations should be transparent about the assumptions behind automated decisions and apply internal quality checks, testing and outcome verification. Individuals affected should also be provided with a way to intervene in the decision-making process, request human involvement, express their views or question the outcome.
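As an illustration of the internal quality checks and human involvement described above, the short Python sketch below routes borderline scores, and any case where the individual has requested human intervention, to a reviewer before the decision takes effect. The score thresholds, field names and recorded reasons are hypothetical and shown only to make the control concrete.

```python
# Minimal sketch of an automated decision with a human-review escape hatch.
# The 0.7 approval threshold and the 0.6-0.8 "borderline" band are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class Decision:
    outcome: str          # e.g. "approve" or "decline"
    score: float          # output of a model or rules engine, 0.0 to 1.0
    reasons: List[str]    # inputs and assumptions recorded for transparency
    human_review: bool    # True if a person must check before it takes effect


def decide(score: float, reasons: List[str], review_requested: bool) -> Decision:
    outcome = "approve" if score >= 0.7 else "decline"
    # Borderline results, and any case where the individual has asked for human
    # involvement, are flagged for review rather than applied automatically.
    needs_review = review_requested or 0.6 <= score < 0.8
    return Decision(outcome, score, reasons, needs_review)


if __name__ == "__main__":
    decision = decide(0.65, ["income_band=B", "tenure_years=2"], review_requested=False)
    print(decision.outcome, decision.human_review)  # decline True
```

Recording the reasons alongside the outcome also gives the organisation something concrete to disclose if the individual questions the decision.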

Algorithms and Strategy

An algorithm is a sequence of defined, computer-implementable instructions used to solve a type of problem or to perform a computation. Algorithms are present wherever computers operate. As a result of the exponential growth of computing power, the enormous increase in data and the rise of artificial intelligence, the role of algorithms has become more prominent in everyday business and in how organisations operate. Companies and organisations should therefore ensure that they have a clear strategy for the use of algorithms that affect individuals. The strategy should sit alongside overall business strategies for growth, efficiency, profits and innovation. All strategic outcomes should be quality tested against how they protect individuals’ personal data, promote information security (and cybersecurity), encourage data transparency, and create data accountability and data fairness (quality and data minimisation).

Profiling

The rise of information technology, online transactions, social media and internet usage around the world has created an explosion of profiling. Companies and organisations may carry out profiling without full awareness or assessment of its impact, or without realising that specific data protection rules apply to the practice. Profiling is the use of mathematical formulas, computations or algorithms to categorise individuals into one or more classes or groups. Profiling can also be used to evaluate individual characteristics such as performance at work, economic standing, health, personal preferences, interests, reliability (skill or competence), behaviour, location, movement, intention or priorities. The most intrusive elements of profiling can be the ability to infer information from data and the ability to predict an individual’s future choices or actions. Inferences and predictions can be wrong, biased, incomplete or based on irrelevant data, yet have a substantial effect on individuals, including discrimination, financial loss, loss of opportunities, distress or other damage. Companies and organisations must be transparent about their use of profiling, apply internal quality checks, and practise data minimisation and verification. Individuals affected must be able to seek information about their profiles and question the decisions made about them.
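One practical way to honour the transparency and data minimisation points above is to record, alongside every profile, exactly which data points produced it, so the inference can be explained and challenged. The Python sketch below is illustrative only; the segment names, thresholds and input fields are invented assumptions, not a reference method.

```python
# Minimal rule-based profiling sketch that records the inputs behind each
# inference so the profile can be explained and challenged on request.
# The segment names and thresholds are invented for illustration.


def build_profile(transactions_per_month: int, avg_transaction_value: float) -> dict:
    if transactions_per_month > 50 and avg_transaction_value > 200:
        segment = "high-activity"
    elif transactions_per_month > 10:
        segment = "regular"
    else:
        segment = "occasional"
    return {
        "segment": segment,
        # Data minimisation: only the two fields actually used are retained,
        # and they can be disclosed to the individual on request.
        "inputs_used": {
            "transactions_per_month": transactions_per_month,
            "avg_transaction_value": avg_transaction_value,
        },
    }


if __name__ == "__main__":
    print(build_profile(12, 85.0))  # {'segment': 'regular', 'inputs_used': {...}}
```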

The GDPR has one of the most sophisticated regulatory frameworks for dealing with profiling and automated decision-making. In most cases, automated decision-making involves profiling. EU policy makers anticipated the growth of profiling by ensuring that all foreign companies (with or without an EU presence) that profile EU citizens’ behaviour in the EU fall within the scope of the GDPR, even where the profiling operations take place outside the EU. This is not always well understood and is often overlooked by organisations. As well as compliance with the GDPR’s main principles and provisions, profiling should always be accompanied by Data Protection Impact Assessments (DPIAs). These DPIAs must also comply with the requirements of the relevant EU Member State’s data protection regulator and local laws. Consulting the individuals affected and the data protection regulator could also be required, depending on the nature of the profiling. The Data Protection Officer should support and drive the process of producing high quality DPIAs that are well written, honest, easy to understand, and effectively reviewed and updated.

Artificial Intelligence

Artificial Intelligence (AI) is the ability of computer systems or computer-controlled robots to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, translation between languages, performing manual tasks, interacting with other computers and interacting with humans. AI is big business and is set to transform the global economy, work, home, education, healthcare and security. The global artificial intelligence market is expected to reach USD 390.9 billion by 2025, according to a report by Grand View Research, Inc., expanding at a Compound Annual Growth Rate (CAGR) of 46.2% from 2019 to 2025. Companies and organisations building AI systems should ensure that algorithms are tested and reviewed and that outputs are verified. Data sources should be quality checked to remove incomplete data, bias and out of date information. Assumptions and inferences should be robustly tested. These steps are basic data hygiene and reflect similar GDPR requirements. However, GDPR compliance and the relevant data protection and privacy laws should also be specifically incorporated into AI data life cycles.
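Parts of the data hygiene described above can be automated. The Python sketch below simply drops incomplete and out-of-date records before they reach a training pipeline; the field names and the 24-month staleness cut-off are assumptions made for illustration, and real pipelines would add bias and accuracy checks on top.

```python
# Minimal pre-training data hygiene sketch: exclude incomplete or stale records.
# Field names and the 24-month cut-off are illustrative assumptions.
from datetime import datetime, timedelta
from typing import Dict, List


def clean_records(records: List[Dict], now: datetime, max_age_months: int = 24) -> List[Dict]:
    cutoff = now - timedelta(days=30 * max_age_months)
    cleaned = []
    for record in records:
        if record.get("income") is None or record.get("postcode") is None:
            continue                      # incomplete: exclude rather than guess
        if record["updated_at"] < cutoff:
            continue                      # out of date: exclude from training
        cleaned.append(record)
    return cleaned


if __name__ == "__main__":
    now = datetime(2020, 9, 1)
    sample = [
        {"income": 30000, "postcode": "EC1", "updated_at": datetime(2020, 3, 1)},
        {"income": None, "postcode": "EC2", "updated_at": datetime(2020, 8, 1)},
        {"income": 42000, "postcode": "EC3", "updated_at": datetime(2016, 1, 1)},
    ]
    print(len(clean_records(sample, now)))  # 1 record survives the checks
```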

Companies and organisations should ensure that AI is explainable, so that the individuals affected can increase their understanding and trust can be built. This requirement maps across to the GDPR’s principles of fairness, lawfulness, transparency, purpose limitation, accuracy, integrity, confidentiality and accountability. Frameworks have been published to help organisations manage and explain AI and improve accountability. The European Union High-Level Expert Group on AI has published Ethics Guidelines for Trustworthy Artificial Intelligence. The United States National Institute of Standards and Technology (NIST) has published Four Principles for Explainable Artificial Intelligence. The UK’s data protection regulator, the Information Commissioner’s Office, and the Alan Turing Institute have published joint guidance on Explaining Decisions Made with AI.

Data Protection Lessons

Data protection maturity can improve companies’ and organisations’ key strategic goals of profitability, growth, efficiency, trust, innovation and resilience. Organisations that attempt to grow without robust data protection find that several of their key strategic goals remain uncertain. Their longevity can be at risk because user, customer and supply chain trust is low. Their efficiency and growth are precarious because, at any time, a data protection regulator, markets regulator, privacy activists, civil society groups, governments or individuals could start campaigns against their poor data protection practices. Fines, bad publicity, internal staff protests, political interjections and whistleblowers can create a state of inherent instability. Excellence in data protection and data protection by design should be positive and proactive advancements rather than reactive responses. In the future, agility and trust will be important economic drivers. Organisations that understand their data and personal data, explain their data uses, embed data protection by design and engage with stakeholders about data governance issues will thrive, remain resilient and fulfil their key strategic objectives.

PS082020

Schrems II: Rethinking Privacy Shield & Standard Contractual Clauses

Briefing

On 16 July 2020, the European Union’s highest court, the Court of Justice of the European Union (CJEU), delivered its much anticipated decision in the Max Schrems case (Schrems II). The court was asked by Ireland’s High Court to decide on key mechanisms for international transfers of personal data from the EU to the United States. The underlying cases arose out of Austrian privacy activist Max Schrems’ complaint against Facebook and Ireland’s Data Protection Commission over the interpretation of key data protection provisions. Max Schrems objected to US surveillance of foreign nationals, which he argued conflicted with the General Data Protection Regulation (GDPR). The court decided that US surveillance laws and practices stand in opposition to the GDPR’s fundamental human rights protections for EU citizens. As a result, personal data transfers to the US are not automatically compliant with EU law and need special attention, assessment, reviews and additional safeguards to make them compliant. The decision has been described as constitutional in nature and cannot be appealed.

Privacy Shield

The Court of Justice of the European Union found that the EU/US Privacy Shield data protection adequacy decision, agreed in 2016, is invalid. Personal data transfers based on this mechanism must cease. EU citizens have no real judicial remedy or equivalent protections in the US under Privacy Shield. The Swiss/US Privacy Shield remains in force, but the Swiss Data Protection Authority is reviewing its position. Privacy Shield continues to operate internally in the USA based on federal enforcement mechanisms, US laws and the role of domestic regulators.

Standard Contractual Clauses (SCCs)

The European Commission’s Data Protection Standard Contractual Clauses remain lawful and enforceable. However, the court insisted that Data Exporters (in the EU) and Data Importers (in foreign countries) must carry out more detailed checks to ensure that foreign laws and data governance rules are compatible with the GDPR. Data Importers must inform Data Exporters if they are unable to comply with EU data protection law. Data Exporters must refuse to transfer personal data where specific personal data transfers are incompatible. EU Data Protection Authorities are also encouraged to intervene, review Standard Contractual Clauses, and be prepared to withhold or withdraw authorisations for international personal data transfers. On 4 June 2021, the European Commission published its final updated Standard Contractual Clauses, which comply with the GDPR and the Schrems II judgment. On 21 March 2022, the UK published its new international data transfer regime.

Responses and Actions

  1. Companies and organisations should assess their exposure to Privacy Shield, work towards stopping these personal data transfers and investigate substitute arrangements. There is no grace period for compliance.
  2. Wait for and act on concrete guidance from each relevant EU Member State’s Data Protection Authority, the European Data Protection Board (EDPB) and the European Commission.
  3. Wait for the European Commission’s new GDPR-approved Standard Contractual Clauses (June 2021) and implement these by December 2022.
  4. Begin to review high value and high risk contracts that contain Standard Contractual Clauses (SCCs) that allow transfers to the USA.
  5. Review Binding Corporate Rules (BCRs) to see if personal data transfer protections from the EU to the USA need to be strengthened or varied.

Resources

EU / US and Swiss / US Privacy Shield Home Page

Schrems II Case Press Release

Schrems II Case Full Judgment

Schrems II European Data Protection Board (EDPB) Frequently Asked Questions

Schrems II US Federal Trade Commission (FTC) Statement

Schrems II US Secretary of Commerce Statement

Schrems II Joint Statement from European Commission and US Department of Commerce

Schrems II Ireland Data Protection Commission (DPC) First Statement

Schrems II UK Data Protection Commissioner’s Office (ICO) First Statement and Updated Statement

Schrems II European Data Protection Board (EDPB) Taskforce on Post-Schrems II Complaints

Schrems II US Department of Commerce, US Justice Department & US Office of the Director of National Intelligence White Paper on US Privacy Safeguards for SCCs and other Legal Bases

Schrems II European Data Protection Supervisor (EDPS) Strategy for EU Institutions to comply with Schrems 2 Ruling

Schrems II European Data Protection Board (EDPB) Supplementary Measures for data transfer tools to ensure GDPR compliance – Consultation

Schrems II European Commission Standard Contractual Clauses (SCCs) 2020 – Consultation  

Schrems II European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) Joint Opinion 2/2021 on Standard Contractual Clauses for the Transfer of Personal Data to Third Countries

European Commission Final Standard Contractual Clauses (SCCs) for Data Controllers and Data Processors and also International Data Transfers – June 2021

UK Information Commissioner’s Office (ICO) Consultation on UK International Data Transfers and UK Standard Contractual Clauses – August 2021

UK Information Commissioner’s Office (ICO) Response to DCMS Consultation “Data: A New Direction” – October 2021

UK GDPR Final International Personal Data Transfers Scheme and Documents – March 2022

European Commission announcement of an EU/US Trans-Atlantic Data Privacy Framework Agreement in Principle – March 2022

White House Briefing Room announcement of an EU/US Trans-Atlantic Data Privacy Framework Agreement in Principle and FactSheet – March 2022

European Commission Questions and Answers (Q&As) for the two sets of EU 2021 Data Protection Standard Contractual Clauses – May 2022

For Further Assistance, contact PrivacySolved:

Telephone (London): +44 207 175 9771

Telephone (Dublin): +353 1 960 9370

Email: contact@privacysolved.com