Impersonation feature on company platforms

The Reality of the Impersonation Feature on Company Platforms.

Many company platforms and apps include an impersonation feature, which allows administrative users to access accounts as though they were logged in as the users themselves.

Imagine knowing that simply by having an account with a company, you are unknowingly granting its everyday employees access to your data in just the same way you would have, had you logged in with your username and password. Such is, or has been, the case with many companies we all use on a regular basis. The truth is that "user impersonation" tools are built into the software of many tech companies, including Facebook and Twitter, which not only allow employees to access your account as though they had logged in as you, but allow them to do so without your knowledge. The account holder, or user, is typically not notified when this happens, nor is their consent required. According to a recent article on OneZero, "…these tools are generally accepted by engineers as common practice and rarely disclosed to users." The problem is that these tools can be, and have been, misused by employees to access users' private information and even track the whereabouts of users of these companies' platforms.

The Fiasco Surrounding Uber’s “God mode” Impersonation Feature.

In recent years, the popular transport company Uber has come under fire for its privacy policies and, in particular, its questionable impersonation feature, known as "God mode". Using the feature, the company's employees were able to track the whereabouts of any user. Uber employees were said to have tracked the movements of all sorts of users, from famous politicians to their own personal relations. After being called to task by US lawmakers, the company apologized for the misuse of the feature by some of its executives and stated that its policies have since been updated to prevent the issue in future. Uber is not unique in this sort of privacy breach: Lyft is known to have comparable tools, along with several other companies.

Impersonation Features Form Part of Most Popular Programming Tools.

Impersonation feature use is far more widespread than a few known companies. Popular web frameworks like Ruby on Rails and Laravel offer impersonation libraries, which have been downloaded several million times. The impersonation tools offered by these frameworks do not usually require users' permission, nor do they notify users that their account has been accessed. It is common for developers to simply whitelist users with administrator access, granting them impersonator mode and thereby allowing them to access any account as though they were logged in as that user.
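As a hypothetical illustration (a minimal sketch, not the actual Rails or Laravel library code), the pattern described above boils down to a single whitelist check: any whitelisted administrator can act as any user, and the user is neither asked for consent nor notified.

```python
# Hypothetical sketch of the typical impersonation pattern described above:
# a whitelist of administrator IDs is the only gate, and the impersonated
# user is neither asked for consent nor notified.

ADMIN_WHITELIST = {"alice", "bob"}  # admins granted impersonator mode

def impersonate(admin_id: str, target_user_id: str) -> dict:
    """Return a session acting as target_user_id, if admin_id is whitelisted."""
    if admin_id not in ADMIN_WHITELIST:
        raise PermissionError("not an administrator")
    # Note what is missing: no consent check, no notification to the user.
    return {"acting_as": target_user_id, "actor": admin_id}

session = impersonate("alice", "user-42")
print(session["acting_as"])  # the admin now sees the app as user-42
```

The names and structure here are invented for illustration; the point is that in this common pattern, nothing in the code path ever touches the impersonated user.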

How Impersonation Features Can Be Made Safer.

Some companies have changed their policies and procedures in order to make impersonation features safer for customers. For example, Uber, following its legal troubles over the 'God mode' feature, now requires its employees to request access to accounts through security. Other companies have resolved to require the user to specifically invite administrators in order to grant them access.
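The safer, invitation-based approach described above can also be sketched hypothetically: the user must first issue a one-time invitation, and every access is written to an audit log. All names here are illustrative assumptions, not any company's real implementation.

```python
# Hypothetical sketch of a consent-gated impersonation flow: the user must
# first issue an invitation, and every access is recorded in an audit log.
import secrets

invitations = {}   # token -> user who granted it
audit_log = []     # (admin, user) pairs, for later review

def invite_support(user_id: str) -> str:
    """The user explicitly grants access by generating a one-time token."""
    token = secrets.token_hex(8)
    invitations[token] = user_id
    return token

def impersonate_with_consent(admin_id: str, token: str) -> str:
    """An admin may only impersonate with a valid, single-use invitation."""
    user_id = invitations.pop(token, None)  # single use: consumed on access
    if user_id is None:
        raise PermissionError("no valid invitation from the user")
    audit_log.append((admin_id, user_id))   # every access is recorded
    return user_id

token = invite_support("user-42")
impersonate_with_consent("support-1", token)  # succeeds exactly once
```

The design choice doing the work here is that consent is an explicit, expiring artifact created by the user, rather than an implicit side effect of administrator status.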

According to Dr Bostjan Makarovic, Aphaia’s Managing Partner, “Whereas there may be legitimate reasons to view a profile through the eyes of the user to whom it belongs, such as further app development and bug repair, GDPR requires that such interests are not overridden by the individual’s privacy interests. This can only be ensured by means of an assessment that is carried out prior to such operations.”

Does your company use impersonation features and want to be sure you are operating within GDPR requirements? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

ICO fines CRDNN Ltd

The ICO Imposed the Maximum Fine of £500,000 on Scottish Company CRDNN Ltd for Automated Nuisance Calls.

The ICO has recently imposed the maximum fine of £500,000 on a Scottish company, CRDNN Ltd, for making nearly 200 million automated nuisance calls.

After receiving over 3,000 complaints about CRDNN Ltd, formerly known as Contact Reach Digital Ltd, the ICO launched an investigation which resulted in a fine of £500,000 for unlawful marketing in the form of automated nuisance calls. Of the calls made, over 63.5 million connected. Some were even made to Network Rail's Banavie control centre, clogging the line meant for drivers and pedestrians at unsupervised rail crossings and potentially putting lives at risk.

The investigation was launched after an ICO raid on the company's headquarters in Clydebank, where computer equipment and documents were seized. The investigation revealed that over 1.6 million calls per day were being made between June 1st and October 1st, 2018. The calls were made for direct marketing purposes from so-called 'spoofed' numbers, meaning that people who received the calls could not identify who was making them, which is contrary to Article 14 GDPR.

In a statement, the ICO's head of investigations, Andy Curry, revealed that not only were these calls unsolicited, but consumers who attempted to opt out were simply bombarded with even more calls as a result. Mr Curry went on to explain that CRDNN incurred the maximum fine because the company's directors "knowingly operated the business with complete disregard for the law" and did all in their power to avoid detection, even going as far as transferring the operation abroad and attempting to liquidate.

The ICO has also issued an enforcement notice to CRDNN Ltd, ordering it to comply with the Privacy and Electronic Communications Regulations (PECR) within 35 days of receipt of the notice. The enforcement notice, issued on February 26th 2020, states that CRDNN's actions violated regulations 19 and 24 of PECR.

We recently reported on two fines issued by the Italian DPA (Garante) on TIM SpA and Eni Gas E Luce, for EUR 27.8 million and EUR 11.5 million respectively. The ICO has now taken its own stand against data mismanagement with this new fine. With officials cracking down on companies that mismanage data, it is imperative that companies ensure they are in line with the GDPR, PECR, and the DPA 2018.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and UK Data Protection Act? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Data Protection Laws and Anti-Money-Laundering

Banks Ask For Guidance in Balancing Data Protection Laws and Anti-Money-Laundering Requirements.

Representatives from the European Banking Industry ask for more legal guidance on how data protection laws should be interpreted, specifically in the Anti-Money-Laundering (AML) realm.

European banking industry representatives are asking for assistance and more legal guidance, claiming that there is tension between the objectives of the GDPR and Anti-Money-Laundering (AML) procedures. While they admit that the GDPR, which has been in effect since May 2018, is a great regulatory initiative to protect privacy rights, they argue it may also be protecting the privacy of criminal networks. Representatives of the European banking industry are therefore asking for more inclusive and pragmatic guidance on the interpretation of the GDPR in the AML realm.

Wim Mijs, CEO of the European Banking Federation, speaking at an event in Brussels on February 19th, had this to say: "We have the GDPR in Europe and it is a great regulation that protects the privacy of citizens. But when it leads to the protection of criminal networks, something is wrong. In my view, the GDPR gives the opportunity to do good law enforcement and exchange of information, but it's lacking."

The Example of Denmark’s AML Task Force

Denmark's money-laundering task force, established in 2018 in light of revelations that the country's largest financial institution had been involved in a dirty-money scandal, recently completed a year's work with experts, lawyers and representatives from the largest banks in Denmark. The task force found that the implementation of EU data privacy regulations could impede banks' ability to efficiently combat money laundering. However, the GDPR allows for the processing of data when it is "necessary for compliance with a legal obligation", which includes 'know your customer' and other AML-related regulations, as indicated by Emmanuel Plasschaert, a Brussels-based lawyer at Crowell & Moring specialising in GDPR and AML. He explained in an interview that, despite possible friction between the two sets of regulation, banks do have the flexibility under the GDPR to process data in their AML efforts.

Banks Need Further Guidance.

Whereas European banks filter and flag money-laundering activities through their typical AML processes, their representatives feel that this becomes too difficult when they must also consider their legal obligations under the GDPR. While there may indeed be a healthy balance, the bank representatives require additional legal support in finding it. Roger Kaiser, senior policy adviser on fiscal and AML matters at the European Banking Federation, notes that banks also process data that is "not strictly required by legal obligations", and in those cases it is unclear what is permitted under the GDPR. Kaiser is calling for "inclusive and pragmatic guidance on how to interpret the GDPR in an AML context", which he believes should be developed together with the European Banking Authority, the agency responsible for ensuring consistent and effective application of the EU's AML directive.

According to Dr Bostjan Makarovic, Aphaia's Managing Partner, "all AML measures clearly fall under legitimate interest, but a guidance might provide clearer answers as to the proportionality of such legitimate measures from the individuals' rights point of view".

Do you need help with GDPR requirements in AML procedures? We can help you. Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

EGL Fined for Unlawful Marketing

Italian DPA (Garante) Imposes a Double Fine on Eni Gas E Luce Totalling EUR 11.5 Million for Two Violations of the GDPR.

The Italian Data Protection Authority (Garante) imposed a double fine on Eni Gas E Luce (EGL) of EUR 11.5 million for unlawful data processing for promotional purposes and activation of unsolicited contracts.

Last month, the European Data Protection Board reported on a double fine imposed on Eni Gas E Luce by the Italian Data Protection Authority. Following an investigation into the marketing practices of Eni Gas E Luce (EGL), the Italian Data Protection Authority imposed a total fine of EUR 11.5 million for unlawful data processing for promotional purposes and the activation of unsolicited contracts. Of the two fines imposed on EGL, the first, of EUR 8.5 million, was for processing in connection with telemarketing and teleselling activities, and the second, of EUR 3 million, was for breaches arising from the conclusion of unsolicited contracts for the supply of electricity and gas under 'free market' conditions.

Unlawful Data Processing

Of the several infringements uncovered during the investigation, the first fine of EUR 8.5 million was for several counts of unlawful data processing. The specific violations included advertising calls made without the consent of the contacted person, or despite that person's refusal to receive promotional calls, or without the required checks against the public opt-out register. The Italian DPA also found that there were no technical and organisational measures in place to take account of the indications provided by users. EGL also retained data for longer than the permitted periods, and acquired data on prospective customers from list providers who had not obtained any consent for the disclosure of such data.

Activation of Unsolicited Contracts.

After receiving many complaints from customers who had received a letter terminating their contract with their previous supplier, or an initial EGL bill, without ever having requested a change of supplier, the Italian DPA conducted an investigation which resulted in an additional EUR 3 million fine. In some cases, customers even reported incorrect data in the contracts and forged signatures.

Corrective and Disciplinary Measures.

The Garante has ordered that, in addition to paying the fine, EGL introduce specific alerts in order to detect certain procedural anomalies. The company is also prohibited from using the data made available by list providers unless those providers obtained specific consent from consumers for the communication of such data to EGL. EGL is also expected to verify the consent of the persons included in its contact lists, by examining a large sample of customers, before the start of any promotional campaigns. All of the aforementioned measures must be implemented and communicated to the Italian DPA within a set timeframe, and the fines must be paid within 30 days.

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR and UK Data Protection Act? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.