The ICO has published a new Age Appropriate Design Code to protect children’s privacy online.
On January 21, the ICO published its final version of the Age Appropriate Design Code. This comes months after the ICO’s completion of a wide-ranging public consultation and engagement period which included meetings with individual organisations, trade bodies, industry and sector representatives, and campaigners.
The ICO says this new code, the first of its kind, sets out 15 standards expected of those responsible for designing, developing or providing online services such as apps, connected toys, social media platforms, online games, educational websites and streaming services. Additionally, the ICO notes that the code covers services likely to be accessed by children and which process their data. Rooted in the GDPR, the code also gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.
“The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website. That means privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default too,” explains the ICO.
“The code says that the best interests of the child should be a primary consideration when designing and developing online services.”
At this stage, the ICO’s first ever Age Appropriate Design Code has been submitted to the Secretary of State and is expected to be laid before Parliament sometime this year. After that, organisations will have 12 months to update their practices before the code comes into full effect, which the ICO projects will be by autumn 2021.
Does your company offer online services likely to be accessed by minors? If so, it will be imperative that you adhere to the Age Appropriate Design Code once it takes effect. Aphaia’s data protection impact assessments and Data Protection Officer outsourcing services can assist you in ensuring compliance.
Today’s blog provides an overview of the GDPR’s expectations regarding employer/employee relations, specifically company policies on communication and security.
If you work, or have worked, in the corporate world, you’re no stranger to the fact that, in order to protect the organisation, most companies have internal policies and procedures covering communications, internet usage, security access and personal data protection. Meanwhile, across the board, more and more companies are using video surveillance for a host of security and protective measures. But do these policies and video surveillance systems comply with the GDPR?
A recent investigation by the Hellenic DPA into the lawfulness of access to and inspection of deleted employee emails, as well as the use of surveillance on company premises, offers a prime opportunity to delve into some of the GDPR’s mandates.
In response to a complaint, the Hellenic DPA conducted an investigation into the lawfulness of personal data processing on a server of ‘ALLSEAS MARINE S.A.’, as well as the lawfulness of access to and inspection of the deleted emails of a senior manager suspected of committing unlawful acts against the company’s interests.
According to the EDPB article, the Hellenic Data Protection Authority deemed that Allseas Marine S.A. had in fact complied with the requirements of the GDPR, and that its internal policies and regulations provided for a ban on the use of the company’s electronic communications and networks for private purposes, and for the possibility of carrying out internal inspections. As such, the Hellenic DPA found that the company had a lawful basis under Articles 5(1) and 6(1)(f) of the GDPR to carry out an internal investigation, searching and retrieving the employee’s emails.
However, as regards Allseas Marine’s use of a closed-circuit video surveillance system, the DPA determined that the system had been installed and operated illegally, and that the recorded material submitted to the Authority was likewise unlawful. The EDPB article further noted that the Hellenic Authority found that the company did not satisfy the employee’s right of access to his personal data contained in his corporate PC.
In response to these GDPR infringements, the Hellenic DPA mandated Allseas Marine S.A. to take several corrective measures in order to comply with the GDPR, and also fined the company €15,000.
The Information Commissioner’s Office (ICO) has imposed a £500,000 fine on UK retailer DSG Retail Limited after a ‘point of sale’ computer system was compromised as a result of a cyber-attack, affecting at least 14 million people.
So, your company accepts credit card payments for its products and services. You value security, so you’ve ensured that your website uses HTTPS (Hypertext Transfer Protocol Secure) to provide secure communication over the network. But is this enough to safeguard the highly sensitive personal data your customers share in online and offline sales? Have you set up adequate protocols to thwart malware or hacking attempts? Or do you believe this isn’t something you need to worry about because, well, your site is HTTPS? “Secure” is built into the acronym, so what could possibly go wrong? A lot, actually, including the possibility of a hefty fine, particularly if your clientele are residents of the EU or UK. So we strongly urge you to take a detailed look at your company’s safeguards, lest you find yourself in hot water, much like UK retailer DSG Retail Limited (DSG), which has been fined half a million pounds by the ICO for failing to keep personal information secure.
A January 9, 2020 ICO news article explains that an ICO investigation revealed that an attacker had installed malware on 5,390 tills at DSG’s Currys PC World and Dixons Travel stores between July 2017 and April 2018, and had collected personal data during the nine-month period before the attack was detected. DSG’s inadequate security systems therefore resulted in unauthorised access to some 5.6 million payment card details and the personal information of approximately 14 million people, including full names, postcodes, email addresses and failed credit checks from internal servers, the ICO further notes.
“Our investigation found systemic failures in the way DSG Retail Limited safeguarded personal data. It is very concerning that these failures related to basic, commonplace security measures, showing a complete disregard for the customers whose personal information was stolen . . . The contraventions in this case were so serious that we imposed the maximum penalty under the previous legislation, but the fine would inevitably have been much higher under the GDPR,” said ICO Director of Investigations Steve Eckersley, as quoted in the news article.
The £500,000 ICO fine was levied under the Data Protection Act 1998, since the breach took place before the GDPR and the DPA 2018 came into effect. Security of processing is covered under Article 32 of the GDPR.
A leaked EU Commission white paper proposes a three-to-five-year ban on facial recognition technology in public places.
As exciting as it all seems (the ability to instantly gain access, perform transactions or even pay bills simply by scanning your face!), there is, without a shadow of a doubt, a scary dark side. The fact that there is an app today which allows others to compare a photo against a database of more than 3 billion photos in order to identify an individual paints a vivid picture of the privacy risks associated with facial recognition. It is no surprise, then, that the EU Commission is considering placing a ban on facial recognition technology.
A Euractiv news report explains that the European Commission is considering measures to impose a temporary ban on facial recognition technologies used by both the public and private sectors. This proposal is presented in an EU draft white paper obtained by Euractiv.
Euractiv expounds that the Commission paper, which gives an insight into proposals for a European approach to Artificial Intelligence, stipulates that a future regulatory framework could “include a time-limited ban on the use of facial recognition technology in public spaces.” It is further proposed that during the ban, a sound methodology for assessing the impact of facial recognition and possible risk management measures could be identified and developed.
Facial recognition falls under the umbrella of biometric data and must therefore be collected, handled and stored in accordance with the GDPR and the UK’s Data Protection Act. Aphaia presents a more detailed view of the GDPR’s stipulations concerning facial recognition here.
Does your company utilize biometric data such as fingerprinting, voiceprinting and facial recognition? If yes, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in a hefty financial penalty. Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. Contact us today.