Dutch DPA Imposed a Fine

The Dutch DPA imposed a fine on the Dutch Tennis Association under the GDPR for illegally selling personal data for marketing purposes.

The Dutch DPA imposed a fine of EUR 525,000 on the Dutch Tennis Association (KNLTB) for the unlawful sale of its members' personal data to two sponsors.

 

The Dutch DPA recently imposed a fine on the Dutch Tennis Association (KNLTB) under the GDPR for the illegal sale of its members' information to two of its sponsors. The information shared included personal data such as names, addresses and genders, which the two sponsors then used to market offers to these individuals by both phone and post. One sponsor purchased the information of 50,000 members, while the other purchased the data of over 300,000 members. While the KNLTB argued that it had a legitimate interest in selling its members' data, the Dutch DPA disagreed, holding that financial gain was the basis of the KNLTB's decision to sell the data and thereby infringe its members' basic rights under the GDPR.

 

Previous Fines by the Dutch DPA.

 

Prior to this most recent fine on the Dutch Tennis Association, the Dutch DPA had imposed two fines under the GDPR. The first was ruled against the Dutch UWV (Employee Insurance Agency) in 2018. As a result, the UWV was required to improve the security level of its logging by October 2019; this deadline has since been postponed by a year, and non-compliance could carry a penalty of EUR 150,000 per month, up to a total of EUR 900,000. The second fine, of EUR 460,000, was imposed on the Dutch Haga Hospital for the insufficient internal security of its patient records, which allowed approximately 200 employees unauthorized access to the medical records of a Dutch celebrity, whose private, personal information was then leaked to the press.

 

The Dutch DPA has also previously launched an investigation into Facebook's failure to adequately inform users that their data was being used for targeted advertising. This did not result in a fine, but it did prompt a change in Facebook's personal data policy.

 

The Dutch DPA’s Policies for Determining Administrative Fines. 

 

In an effort to maintain consistency in the fines it imposes, the Dutch DPA has specific policies for determining the level of these administrative fines. Infringements are divided into categories, determined by the relevant GDPR article. As reported by the INPLP, the fines imposed under this policy can be increased or reduced depending on the following relevant factors:

 

  • The nature, severity and duration of the infringement, taking into account the nature, scope or purpose of the processing in question, the number of persons affected and the extent of the damage suffered by them.
  • The deliberate or careless nature of the infringement.
  • The measures taken by the controller or the processor to limit the damage to the data subjects involved.
  • The extent to which the controller or the processor is responsible, considering the technical and organizational measures that had to be taken under articles 25 and 32 of the GDPR. 
  • Previous infringements, where relevant, by the controller or the processor.
  • The level of cooperation with the Dutch DPA to remedy the infringement and reduce its possible negative consequences.
  • The categories of personal data affected by the infringement.
  • The manner in which the Dutch DPA has been notified of the infringement and whether the controller or the processor has reported the infringement.
  • The extent to which the controller or the processor has complied with any previous measures imposed by the Dutch DPA, as referred to in article 58 (2) of the GDPR.
  • Compliance with approved codes of conduct in accordance with article 40 of the GDPR or with approved certification mechanisms referred to in article 42 of the GDPR.
  • Any other circumstances that may be regarded as aggravating or mitigating factors, such as financial gains realised, or losses avoided, whether or not directly arising from the infringement.

 

The Dutch DPA's general guide for imposing fines is based on the following categories, as determined by the corresponding GDPR infringement:

 

Category   Range of Fines         Standard Fine
I          €0–€200,000            €100,000
II         €120,000–€500,000      €250,000
III        €300,000–€750,000      €525,000
IV         €450,000–€1,000,000    €725,000
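Read as data, the category table maps each infringement category to a fine band and a standard starting amount. A minimal Python sketch of that lookup, useful as a sanity check when reading fine reports (the names FINE_CATEGORIES and standard_fine are illustrative, not part of the DPA's policy):

```python
# Dutch DPA fining policy categories (amounts in EUR, taken from the table above).
# These names are illustrative only, not terminology used by the DPA itself.
FINE_CATEGORIES = {
    "I":   {"range": (0, 200_000),         "standard": 100_000},
    "II":  {"range": (120_000, 500_000),   "standard": 250_000},
    "III": {"range": (300_000, 750_000),   "standard": 525_000},
    "IV":  {"range": (450_000, 1_000_000), "standard": 725_000},
}

def standard_fine(category: str) -> int:
    """Return the standard (starting) fine for a GDPR infringement category."""
    return FINE_CATEGORIES[category]["standard"]

# The KNLTB infringement fell into category III:
print(standard_fine("III"))  # 525000
```

The standard amount is only the starting point; as the factors listed earlier show, the DPA may move the final figure up or down within the category's band.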

 

The fine imposed on the Dutch Tennis Association, KNLTB, was based on a category III infringement and therefore incurred the standard fine for that category: €525,000. So far this year, we have reported on two fines issued by the Italian DPA (Garante), on TIM SpA and Eni Gas e Luce, of EUR 27.8 million and EUR 11.5 million respectively, and more recently on a fine of half a million pounds imposed on CRDNN Ltd by the UK's DPA, the ICO.

 

With officials cracking down on companies that mismanage their data, it is imperative that companies ensure they are in line with the GDPR, PECR 2003, and the DPA 2018. While this is only the third fine imposed by the Dutch DPA under the GDPR, the Dutch DPA is the first in the EU to define its own policy for imposing fines, which may inspire other countries to do the same.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Impersonation feature on company platforms

The Reality of the Impersonation Feature on Company Platforms.

Many company platforms and apps include an impersonation feature which allows administrative users to access accounts as though they were logged in as the users themselves.

Imagine knowing that simply by having an account with a company, you are unknowingly granting that company's everyday employees access to your data in just the same way that you would access it yourself, had you logged in with your username and password. Such is, or has been, the case with many companies that we all use on a regular basis. The truth is that there are "user impersonation" tools built into the software of many tech companies like Facebook and Twitter, which allow employees to access your account as though they had logged in as you, often without your knowledge. The account holder is typically not notified when this happens, nor is their consent needed. According to a recent article on OneZero, "…these tools are generally accepted by engineers as common practice and rarely disclosed to users." The problem is that these tools can be, and have been, misused by employees to access users' private information and even track the whereabouts of users of these companies' platforms.

The Fiasco Surrounding Uber’s “God mode” Impersonation Feature.

In recent years, the popular transport company Uber has come under fire for its privacy policies and, in particular, its questionable impersonation feature, known as "God mode". Using the feature, the company's employees were able to track the whereabouts of any user, and Uber employees were said to have tracked the movements of all sorts of users, from famous politicians to their own personal acquaintances. After being called to task by US lawmakers, the company apologized for the misuse of this feature by some of its executives and stated that its policies have since been updated to avoid this issue in the future. Uber is not alone in this sort of privacy breach: Lyft is also known to have comparable tools, along with several other companies.

Impersonation Features Form Part of Most Popular Programming Tools.

Impersonation feature use is much more widespread than just a few known companies. Popular web frameworks like Ruby on Rails and Laravel offer impersonation libraries, which have been downloaded several million times. The impersonation tools offered by these frameworks do not usually require users' permission, nor do they notify users that their account has been accessed. It is common for developers to simply whitelist users with administrator access, granting them impersonator mode and thereby allowing them to access any account as though they were logged in as that user.

How Impersonation Features Can Be Made Safer.

Some companies have made changes to their policies and procedures in order to make impersonation features safer for customers. For example, following its legal troubles over the "God mode" feature, Uber now requires its employees to request access to accounts through security. Other companies have resolved to require the user to specifically invite administrators in order to grant them access.
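The safer pattern described here (an explicit, user-granted invitation plus a record of every access attempt) can be sketched in a few lines of Python. This is an illustrative sketch only, not any company's actual implementation; all names, such as ImpersonationGate, are hypothetical, and a real system would add expiry, scoping and approval workflows:

```python
# Illustrative sketch of a consent-gated impersonation check with audit logging.
# All names are hypothetical; real deployments would add grant expiry, scoping, etc.

class ImpersonationGate:
    def __init__(self):
        self._grants = set()   # (admin_id, user_id) pairs the user has approved
        self.audit_log = []    # every attempt, allowed or denied, is recorded

    def grant(self, user_id: str, admin_id: str) -> None:
        """The user explicitly invites an administrator into their account."""
        self._grants.add((admin_id, user_id))

    def impersonate(self, admin_id: str, user_id: str) -> bool:
        """Allow access only if the user invited this admin; log the attempt either way."""
        allowed = (admin_id, user_id) in self._grants
        self.audit_log.append({"admin": admin_id, "user": user_id, "allowed": allowed})
        return allowed

gate = ImpersonationGate()
print(gate.impersonate("admin1", "alice"))  # False: no invitation, attempt still logged
gate.grant("alice", "admin1")
print(gate.impersonate("admin1", "alice"))  # True: access was explicitly invited
```

The key design choice is that denial is the default and the audit log captures failed attempts too, so misuse leaves a trace even when it does not succeed.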

According to Dr Bostjan Makarovic, Aphaia’s Managing Partner, “Whereas there may be legitimate reasons to view a profile through the eyes of the user to whom it belongs, such as further app development and bug repair, GDPR requires that such interests are not overridden by the individual’s privacy interests. This can only be ensured by means of an assessment that is carried out prior to such operations.”

Does your company use impersonation features and want to be sure you are operating within GDPR requirements? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

TIM Fined for Unlawful Marketing

Garante Fines TIM SpA EUR 27.8 Million for Unlawful Marketing.

The Italian Data Protection Authority (DPA) Garante fined TIM SpA EUR 27,802,496 for several instances of unlawful data processing for marketing purposes.

Complex investigations were carried out after the DPA received hundreds of complaints between January 2017 and early 2019 regarding unlawful processing for marketing purposes, in particular unsolicited marketing calls performed without any consent by call centers acting on behalf of TIM SpA. In some cases, the concerned parties had either denied their consent to receive marketing calls or were part of the public opt-out register. Some complaints also mentioned unfair prize competition processes and the applicable forms, among other issues. The investigations were carried out with the aid of a specialised unit of the Italian Financial Police and revealed several critical infringements of personal data protection legislation.

Unlawful ‘cold’ marketing calls

TIM SpA, Italy's largest telecommunications service provider, was found to have had marketing calls placed on its behalf by various call centers to millions of non-customers without their consent. Calls were also made to several customers who were on a marketing blacklist, and over two hundred thousand numbers were called that were not included in TIM's list of marketing numbers. According to the European Data Protection Board, "Other types of illicit conduct were also found such as TIM's failure to supervise the activities of some call centres or to properly manage and update their blacklists (listing individuals who do not wish to receive marketing calls), and the fact that consent to marketing activities was mandatory in order to join the 'Tim Party' incentive discount scheme."

Measures issued by the Italian DPA

In addition to imposing the fine on TIM, the Italian DPA also imposed certain injunctions and prohibitions. The injunctions require TIM to check the consistency of its blacklists and to allow customers to access discount schemes and prize competitions without having to consent to marketing interactions. TIM will also have to review its app activation procedures and always specify, in clear and understandable language, the processing activities it performs, along with their purposes and the relevant processing mechanisms, making sure valid consent is obtained. In addition, the company is no longer allowed to use customer data collected through its three apps, 'MyTim', 'TimPersonal' and 'TimSmartKid', for any purposes other than providing the relevant services without the users' free, specific consent. These are only part of the total of 20 corrective measures imposed on TIM by the Italian DPA, all of which must be implemented, with progress reported to the Italian SA according to a specific timeline, in addition to the EUR 27.8 million fine, which must be paid within 30 days.

Should our business be worried?

One should keep in mind that the rules on ‘cold calls’ vary from country to country, even within the GDPR framework. It is therefore important to consult an expert before deciding to engage in cold marketing calls or cold emailing. The latter is generally prohibited in all the EU Member States and the UK, with some exceptions.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and UK Data Protection Act? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Data sharing code

ICO launches new Data Sharing Code in line with GDPR and DPA 2018

The ICO’s updated Data Sharing Code will provide companies with practical guidelines about how to share personal data in compliance with data protection legislation.

In today's highly digital, efficiency-focused era, data sharing undoubtedly plays a significant role. Indeed, major technological shifts in how organizations do business present persuasive arguments for the need for data sharing. Just as prevalent, however, are the related privacy concerns.

For public and private organizations alike, the balancing act of sharing data without compromising sensitive personal information is vital. Not to mention the need to ensure compliance with GDPR and the Data Protection Act 2018.

The good news is that the update to the ICO data sharing code of practice is well on its way to being finalized.

Prepared under section 121 of the Data Protection Act 2018, the updated ICO data sharing code—currently in draft—will serve as a practical guide for organisations about how to share personal data in compliance with data protection legislation.

As noted in the draft code summary, the code explains the law and provides good practice recommendations. As such, “following it along with other ICO guidance will help companies manage risks; meet high standards; clarify any misconceptions organisations may have about data sharing; and give confidence to share data appropriately and correctly.”

According to the ICO, the code will also address many aspects of the new legislation, including transparency, lawful bases for processing, the new accountability principle and the requirement to record processing activities.

It is also important to note that in accordance with section 127 of the DPA the ICO will take the code into account when considering whether organisations have complied with their data protection obligations in relation to data sharing. In particular, the Commissioner will take the code into account when considering questions of fairness, lawfulness, transparency and accountability under the GDPR or the DPA. The code can also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant.

Public consultation on the draft data sharing code was launched in July and came to an end on September 9th. The draft code is now expected to be approved by Parliament before it becomes a statutory code of practice.

 

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.