ICO fine

UK Retailer fined half a million pounds due to poor security safeguards

The Information Commissioner’s Office (ICO) has imposed a £500,000 fine on UK retailer DSG Retail Limited after a ‘point of sale’ computer system was compromised as a result of a cyber-attack, affecting at least 14 million people.

OK, so your company accepts credit card payments for its product and service offerings. You value security, so you've ensured that your website uses HTTPS (hypertext transfer protocol secure) to provide secure communication over the network. But is this enough to safeguard the highly sensitive personal data your customers hand over in online and offline sales? Have you set up adequate protocols to thwart malware or hacking attempts? Or do you believe this isn't something you need to worry about because… well, your site is HTTPS. “Secure” is built into the acronym, so what could possibly go wrong? A lot, actually, including the possibility of a hefty fine, particularly if your clientele are residents of the EU or UK. So we strongly urge you to take a detailed look at your company's safeguards lest you find yourself in hot water, much like UK retailer DSG Retail Limited (DSG), which has been fined half a million pounds by the ICO for failing to keep personal information secure.

A January 9, 2020 ICO news article explains that an ICO investigation revealed that an attacker had installed malware on 5,390 tills at DSG's Currys PC World and Dixons Travel stores between July 2017 and April 2018, and had collected personal data over the nine-month period before the attack was detected. DSG's inadequate security systems therefore resulted in unauthorised access to some 5.6 million payment cards' details and the personal information of approximately 14 million people, including full names, postcodes, email addresses and failed credit checks from internal servers, the ICO further notes.

“Our investigation found systemic failures in the way DSG Retail Limited safeguarded personal data. It is very concerning that these failures related to basic, commonplace security measures, showing a complete disregard for the customers whose personal information was stolen… The contraventions in this case were so serious that we imposed the maximum penalty under the previous legislation, but the fine would inevitably have been much higher under the GDPR,” ICO Director of Investigations Steve Eckersley is quoted as saying in the news article.

The £500,000 ICO fine was levied under the Data Protection Act 1998, since the breach took place before the GDPR and the Data Protection Act 2018 came into effect. Under the current regime, security of processing is covered by Article 32 of the GDPR.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and UK Data Protection Act? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.  We can help your company get on track towards full compliance. Contact us today.

Ban on facial recognition technology

The use of facial recognition technology in public places in the EU could be temporarily banned

A leaked EU Commission white paper proposes that the EU place a three-to-five-year ban on facial recognition technology in public places.

As exciting as it all seems, the ability to instantly gain access, perform transactions or even pay bills simply by scanning your face comes, without a shadow of a doubt, with a scary dark side. The fact that in today's world there is an app which allows others to compare a photo against a database of more than 3 billion photos in order to identify an individual paints a vivid picture of the privacy risks associated with facial recognition. It is no surprise, then, that the EU Commission is considering placing a ban on facial recognition technology.

A Euractiv news report explains that the European Commission is considering measures to impose a temporary ban on facial recognition technologies used by both the public and private sector. This proactive suggestion is presented in an EU draft white paper obtained by Euractiv.

Euractiv expounds that the Commission paper, which gives an insight into proposals for a European approach to Artificial Intelligence, stipulates that a future regulatory framework could include “a time-limited ban on the use of facial recognition technology in public spaces”. It is further proposed that during the ban “a sound methodology for assessing the impact of facial recognition and possible risk management measures could be identified and developed”.

Facial recognition falls under the umbrella of biometric data and must therefore be collected, handled and stored in accordance with the GDPR and the UK's Data Protection Act. Aphaia presents a more detailed view of the GDPR's stipulations concerning facial recognition here.

Does your company utilize biometric data such as fingerprinting, voiceprinting and facial recognition? If yes, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in a hefty financial penalty. Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. Contact us today.

European Supermarket Chain may face inspection over new fingerprinting system

Belgian data protection authority, Gegevensbeschermingsautoriteit, may launch an investigation into supermarket chain Carrefour’s fingerprint payment system.

There's no denying that we currently live in a fast-paced, highly technological era, one which constantly ushers in new means of identifying individuals and processing digital payments, all geared towards increased convenience. At this stage, thanks to advances in mobile phones, fingerprinting may very well be one of the more widely used means of identification, but its uses are certainly not confined to mobile devices. In fact, just this week, one of Europe's largest supermarket chains, Carrefour, announced that it will organise a pilot project allowing clients to pay for their groceries with their fingerprints in a store in the centre of Brussels.

A report from the Brussels Times explains that the Carrefour pilot project will enable clients to pay by scanning their finger at the cash register, after which the money will disappear from their bank account. And while this may result in faster checkout times and a more convenient means of shopping, there are undoubtedly privacy and security risks, risks which the Belgian data protection authority would not only like consumers to be aware of but which may warrant, and lead to, an investigation by the DPA.

Referencing a report from De Standaard, the Brussels Times presented the following comment from David Stevens, president of the GBA:

“We asked Carrefour a few questions and discovered that a test had already taken place… It turned out that Carrefour had already collected fingerprints. Now that we've heard the news about the new experiment with fingerprint payments, there's a good chance we'll send our inspectors. I cannot yet formally confirm that we will do that, but I think there is a good chance.”

“…that is more than just a signature on paper. Customers really have to understand the risks. If, through hacking, your password falls into the wrong hands, you can replace it. But you cannot just change your fingerprint, face or the iris of your eye. Hence the strict rules,” Stevens is further reported to have said.

Fingerprinting constitutes the use of biometric data, i.e. a way to measure a person's physical characteristics to verify their identity. Under the GDPR, biometric data is defined in Article 4(14), and where it is processed to uniquely identify a person it falls within the special categories of personal data under Article 9, which means data protection rules apply directly to fingerprints. Biometric data is therefore personal data which must be processed on a lawful basis in compliance with the GDPR and the UK's Data Protection Act.

Does your company utilize biometric data such as fingerprinting, voiceprinting and facial recognition? If yes, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in a hefty financial penalty. Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. Contact us today.

ICO launches consultation on the draft direct marketing code of practice

Public consultation for the UK draft direct marketing code of practice is now open.

Earlier this month the ICO launched a public consultation on its draft direct marketing code of practice.  This draft code has been produced in accordance with section 122 of the Data Protection Act 2018.

According to the ICO, the draft code of practice aims to provide practical guidance and promote good practice with regard to processing for direct marketing purposes in compliance with data protection and e-privacy rules. The ICO further notes that the draft code covers the legislation as it currently stands, which for e-privacy means the Privacy and Electronic Communications Regulations 2003 (PECR).

Who is the code applicable to?

The code applies to any business which processes personal data for direct marketing.

Direct marketing is expounded by the ICO to include the promotion of aims and ideals as well as advertising goods or services; any method of communication which is directed to particular individuals could constitute direct marketing. “Direct marketing purposes include all processing activities that lead up to, enable or support the sending of direct marketing,” says the ICO.

Draft Direct Marketing code of practice overview:

The code provides guidance on:

Planning marketing activities – Data protection by design
Generating leads and collecting contact details
Profiling and data enrichment
Sending direct marketing messages
Online advertising and new technologies
Selling or sharing data
Individual rights

The ICO notes that following the code, along with other ICO guidance, will enable companies to comply with the GDPR and PECR.

The public consultation on the draft code will remain open until 4 March 2020.

Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing.  Contact us today.