ICO urges UK businesses

ICO urges UK businesses: ensure compliance with data protection law before the end of the UK’s transition.

ICO urges UK businesses to ensure compliance with data protection law before the end of the UK’s transition period on December 31st 2020.

 

December 31st 2020 officially ends the UK’s transition period out of the EU, and the ICO is calling on UK businesses affected by data protection law to take the necessary steps to ensure that data continues to flow lawfully from the EU. The ICO advises that any business receiving data from organisations in the EU or European Economic Area (EEA, which includes the EU, Iceland, Norway and Liechtenstein) will need to take action to ensure the flow of data does not stop.

 

Many SMEs depend on the flow of personal data to operate, and the ICO seeks to aid these businesses during the transition. 

Personal data covers anything that relates to an identifiable individual, whether it is information on customers or staff. HR records, customer details, payroll information and information collected through cloud services are all classified as personal data and may be affected. The ICO recognises that sharing personal data is essential to running the majority of SMEs and that smaller organisations may not have dedicated data protection officers or specialists to help with the preparations. It has therefore published a statement advising businesses on steps they can take before January 1st to ensure continued compliance.

The ICO urges UK businesses to maintain compliance with the DPA 2018 and the GDPR, and to double-check their privacy information.

 

Businesses in the UK will need to continue to comply with the GDPR and the DPA 2018. Where data is exchanged between entities in the UK and the EU, however, businesses will need to have safeguards in place from January 1st 2021 so that the continued flow of data remains lawful. The ICO has gathered guidance and resources on its website and urges businesses that use personal data to make use of them to determine the actions they may need to take. In addition, businesses should review their privacy information and other documentation for changes that may be needed at the end of the transition period.

 

For most businesses and organisations, the ICO suggests Standard Contractual Clauses (SCCs) to keep data flowing on EU-approved terms. 

The ICO statement suggests that standard contractual clauses, or SCCs, may be the best option for businesses that use personal data and want to ensure their data transfers are EU-approved. As businesses in the UK will officially be treated as non-EU processors or controllers from January 1st 2021, SCCs, which have proven to be a sufficient safeguard for transfers of data between controllers and processors within the EU and internationally, are recommended as the best option for UK businesses to adopt post-transition.

 

Do you make international data transfers to third countries? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, transfer impact assessments and Data Protection Officer outsourcing. Contact us today.

Healthcare providers

Healthcare providers’ broad data access authorisations lead to fines of over 2.9 million euros.

Healthcare providers’ broad data access authorisations lead to fines of over 2.9 million euros in Sweden. 

 

After reviewing eight healthcare providers, the Swedish DPA found deficiencies in the way they protected access to electronic health records. The assessments primarily examined whether the healthcare providers had conducted the needs and risk analyses required to adequately assign access authorisation for personal data in electronic health records.

 

The healthcare providers’ privacy deficiencies were mainly due to their failure to carry out sufficient assessments to determine adequate access authorisation.

 

All healthcare providers must be able to demonstrate a sufficient level of security for the personal data in their electronic health record systems. They must carry out a thorough analysis and assessment of the personnel’s need to access information in the health records and of the risks that access to patient data entails, as outlined in the Swedish Patient Data Act, which complements the GDPR. It is through these analyses that healthcare providers are able to appropriately assign personnel their level of authorisation. Without them, organisations cannot guarantee patients’ right to privacy protection.

 

In seven of the eight reviewed cases, the healthcare providers’ privacy deficiencies were mainly due to their failure to carry out sufficient assessments to determine adequate access authorisation to electronic health records. While the eighth healthcare provider did conduct a needs and risk analysis, that analysis contained some shortcomings.

 

Seven of the eight healthcare providers assessed were hit with administrative fines of varying amounts, up to EUR 2.9 million.

 

Seven of the healthcare providers’ deficiencies were so serious that they resulted in administrative fines ranging from approximately EUR 250,000 to EUR 2.9 million. The amount of the fine differs significantly depending on whether it is imposed on a private company or a public authority. For companies, the maximum fine is EUR 20 million or four percent of the company’s global annual turnover, whichever is higher, while the maximum fine for authorities in Sweden is approximately EUR 1 million.
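As a purely illustrative sketch of how the “whichever is higher” cap for private companies works out in practice (the turnover figures below are hypothetical and not taken from the Swedish decisions):

```python
# Illustrative only: the GDPR ceiling for the most serious infringements by a
# private company is the higher of EUR 20 million or 4% of global annual turnover.
def gdpr_max_fine_eur(global_annual_turnover_eur: float) -> float:
    """Return the maximum possible fine in euros for a private company."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

print(gdpr_max_fine_eur(100_000_000))    # 20,000,000 -> 4% (EUR 4m) is below the EUR 20m floor
print(gdpr_max_fine_eur(1_000_000_000))  # 40,000,000 -> 4% exceeds EUR 20m, so 4% applies
```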

 

Based on the conclusions of these audits, the Swedish DPA has developed guidelines regarding the obligation to conduct needs and risk analyses. 

 

The Swedish DPA has developed guidelines highlighting how important it is for healthcare providers to carry out needs and risk analyses. This is an effort to ensure that patients are given the privacy protection they are entitled to, by helping healthcare providers conduct these analyses. It is important to note that these assessments must be carried out before any access authorisation is assigned to personnel in a health record system.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018 in handling customer data? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance.

 

 

AI systems and COVID-19

Are the AI Systems Used for Contact Tracing of COVID-19 Ethical?

Are the AI systems used for contact tracing of COVID-19 ethical? In our latest vlog, we explore the extent to which the use of these systems is ethical, and why.

 

With many European nations launching the Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) initiative to release software code that can be used to create contact tracing apps tracking the possible transmission of COVID-19, many wonder about the extent to which this is ethical. The apps in question would use phones’ Bluetooth signals to track users’ proximity to one another, and would then inform users if they had been in the proximity of someone who had tested positive for the virus. Last week, we explored the use of AI in tracking or preventing the spread of COVID-19. This week, we take a deeper look at the ethical implications of the use of such technology in our society.
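To make the Bluetooth proximity idea concrete, here is a minimal, hypothetical sketch of how a decentralised exposure check of this kind could work: phones broadcast rotating ephemeral identifiers, log the identifiers they hear, and later compare that local log against identifiers published by users who tested positive. This is not the PEPP-PT reference implementation; all class names, thresholds and figures are illustrative assumptions.

```python
# Hypothetical sketch of decentralised Bluetooth-based exposure notification.
# Not the PEPP-PT codebase; names and thresholds are illustrative.
import secrets
from dataclasses import dataclass, field


@dataclass
class Device:
    """A phone that broadcasts rotating ephemeral IDs and logs the IDs it hears."""
    own_ids: list = field(default_factory=list)  # ephemeral IDs this device has broadcast
    heard: dict = field(default_factory=dict)    # ephemeral ID -> minutes of nearby contact

    def broadcast_id(self) -> str:
        """Generate a fresh ephemeral ID (rotated regularly so users cannot be tracked)."""
        eid = secrets.token_hex(16)
        self.own_ids.append(eid)
        return eid

    def observe(self, eid: str, minutes: int) -> None:
        """Record time spent near another device's current ephemeral ID."""
        self.heard[eid] = self.heard.get(eid, 0) + minutes

    def check_exposure(self, published_positive_ids: set, threshold_minutes: int = 15) -> bool:
        """Compare the local contact log against IDs published by users who tested positive."""
        exposure = sum(m for eid, m in self.heard.items() if eid in published_positive_ids)
        return exposure >= threshold_minutes


# Usage: Alice's phone hears Bob's ID for 20 minutes; Bob later tests positive.
alice, bob = Device(), Device()
alice.observe(bob.broadcast_id(), minutes=20)
published = set(bob.own_ids)            # Bob uploads only his own ephemeral IDs
print(alice.check_exposure(published))  # True -> Alice is notified of possible exposure
```

The design choice being illustrated is that matching happens on the user’s own device against anonymised identifiers, which is the privacy-preserving property the ethical debate turns on.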

 

According to Article 9 of the GDPR, special categories of personal data can only be processed under specific circumstances. These circumstances include, among others, vital interests and public health. With regard to public health as a condition for processing personal data, the condition is not met simply because the processing serves a public health interest. According to the Data Protection Act 2018, the processing would also need to be carried out by, or under the responsibility of, a health professional, or by another person who in the circumstances owes a duty of confidentiality under the law. Article 22 of the GDPR states that, without the data subject’s explicit consent, decisions based solely on automated processing, including profiling, are only allowed where authorised by Union or Member State law.

 

With all this considered, the ethics of the AI systems used in the fight against COVID-19 play a vital role in maintaining accuracy and non-discrimination. While these measures seem very helpful right now for the sake of public health, there is a risk that they will persist beyond the COVID-19 pandemic. In our latest vlog, we explore the ethics of the use of these AI systems.

Please subscribe to our YouTube channel to be updated on future content. Do you have questions about how to navigate data protection laws in your company during this global coronavirus pandemic? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including Data Protection Impact Assessments, AI Ethics Assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.