TikTok fined by Dutch DPA

TikTok fined by Dutch DPA for failure to provide translated information to users

The video sharing social networking app TikTok was recently fined by the Dutch DPA, according to this report from the EDPB. During an investigation into apps typically used by minors, it was discovered that the information provided when installing the app, including the privacy policy, was only available in English. By failing to provide this information in Dutch, TikTok violated the privacy rights of Dutch-speaking users, who were not given clear, comprehensible information on what happens with their personal data. TikTok has been fined €750,000, a penalty the company has objected to. 

TikTok, fined by the Dutch DPA, is now being investigated by the Irish DPA after establishing its European headquarters in Ireland. 

While this initial fine was imposed by the Dutch DPA, and rightfully so because at the time TikTok had no headquarters in the EU, the company has since established its European headquarters in Ireland. The initial fine could have been imposed by any EU member state; however, any subsequent investigations must be handled by the Irish Data Protection Commission. The Dutch Data Protection Authority can only be expected to assess the violation related to the privacy statement, which had ended by the time the Irish headquarters was established. When a company has no European headquarters, any EU member state can oversee its activities; once there is a European headquarters, this responsibility falls on the country that houses it.

TikTok has made changes to its app to make it safer for child users. 

Since last October, when the Dutch DPA submitted the results of its investigation to TikTok, certain key changes have been made to protect users under 16 while they use the app. While these changes are not entirely foolproof, because children can still pretend to be older by creating their account with false information, the DPA welcomes the adjustments made by TikTok to reduce the risk for child users. Parents are now able to manage their children’s accounts through their own accounts via the ‘Family Pairing’ feature. This will not prevent children from putting themselves at risk by lying about their age, but it will give parents the power to monitor their children’s accounts and provide greater security. 

Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

LinkedIn users’ data for sale

LinkedIn users’ data for sale on hacking forum – 700 million affected

The details of 700 million LinkedIn users were recently posted for sale on a notorious hacking forum. 

Last month, a user put the information up for sale on RaidForums, where it was spotted by Privacy Sharks, a news site. The seller provided a sample of one million records, which Privacy Sharks examined, confirming that the records were valid and included names, genders, phone numbers, email addresses and work information. This is the second instance this year of LinkedIn user information being scraped and posted for sale online; in April, a similar event affected 500 million LinkedIn users. 

 

LinkedIn’s investigation revealed that the data was scraped from LinkedIn as well as other sources. 

LinkedIn maintains that this compilation of information on 700 million users was not the result of a data breach, and that the information is all publicly available. The company reported that no private LinkedIn member data was exposed. An initial analysis in the ongoing investigation found that the data includes information scraped from LinkedIn as well as other sources. In a statement, LinkedIn said it had determined that the information posted for sale was “an aggregation of data from a number of websites and companies.” The company also states that scraping and other misuse of members’ data violates its terms of service, and that it will work to stop any entities misusing LinkedIn members’ data and hold them accountable. 

 

LinkedIn has pursued legal action in the past over data scraping that violated its terms of service. 

While no one has been named as responsible in this case, LinkedIn is nearly two years into a legal battle to protect its user data and terms of service against data scraping. Data analytics organization hiQ Labs had been using automated bots to scrape information from public LinkedIn profiles; LinkedIn served it with a cease and desist, claiming that this violated its terms of service, and the dispute went before the United States Court of Appeals for the Ninth Circuit. In September 2019, that court ruled against LinkedIn, finding the scraping legal because the information being collected was all publicly available. Last month, however, the case reached the US Supreme Court, which threw out the Ninth Circuit’s ruling and sent the case back, giving LinkedIn another opportunity to plead its case in the Ninth Circuit. No statement has been made as to whether legal action will also be taken over this latest incident. 

Emergency measures for children’s protection

EU approves emergency measures for children’s protection

Temporary emergency measures for children’s protection have just been adopted by the European Parliament.

Temporary emergency measures for children’s protection were adopted by the European Parliament on July 6th. This regulation will allow electronic communication service providers to scan private online messages containing any display of child sex abuse. The European Commission reported that almost 4 million visual media files containing child abuse were reported last year, along with 1,500 reports of grooming of minors by sexual predators. Over the past 15 years, reports of this kind have increased by 15,000%. 

 

This new regulation, which is intended to be implemented using AI, has raised some questions regarding privacy. 

Electronic communication service providers are being given the green light to voluntarily scan private conversations and flag content which may contain any display of child sex abuse. The scanning procedure will detect content for flagging using AI, under human supervision. Providers will also be able to use anti-grooming technologies once consultations with data protection authorities are complete. These mechanisms have received some pushback due to privacy concerns; last year, the EDPB published a non-binding opinion questioning whether these measures would threaten the fundamental right to privacy. 

 

Critics argue that this law will not prevent child abuse but will rather make it more difficult to detect, while potentially exposing legitimate communication between adults. 

This controversial legislation, drafted in September 2020 at the peak of the global pandemic, which saw a spike in reports of minors being targeted by predators online, enables companies to voluntarily monitor material related to child sexual abuse; it does not require companies to take action. Still, several privacy concerns were raised regarding its implementation, particularly around exposing legitimate conversations between adults which may contain nude material, violating their privacy and potentially opening them up to some form of abuse. During the negotiations, changes were made to require informing users of the possibility that their communications may be scanned, and to set data retention periods and limitations on the use of this technology. Despite this, the initiative was criticized on the grounds that automated tools flag non-relevant material in the majority of cases, and concerns were raised about the possible effect on channels for confidential counseling. Ultimately, critics believe that this will not prevent child abuse but will make it harder to discover, as it would push abusers towards more hidden tactics. 

 

This new EU law for children’s protection is a temporary solution for dealing with the ongoing problem of child sexual abuse. 

From the start of 2021, the definition of electronic communications under EU law was changed to include messaging services. As a result, private messaging, which was previously regulated by the GDPR, is now regulated by the ePrivacy Directive. Unlike the GDPR, the ePrivacy Directive did not include measures to detect child sexual abuse, and voluntary reporting by online providers fell dramatically after the change. Negotiations on revising the ePrivacy Directive to include protection against child sexual abuse have stalled for several years. This new EU law for children’s protection is but a temporary measure, intended to last until December 2025, or until the revised ePrivacy Directive enters into force. 

 

Colorado Privacy Act written into law

Colorado Privacy Act has been written into law, making Colorado the third US state with comprehensive privacy laws. 

The Colorado Privacy Act (CPA) has recently been signed into law, giving the residents of Colorado comprehensive privacy protections for the first time. Colorado is now the third US state to enact such a law, and the CPA is very similar to those which came before it, with a few key differences. Unlike the California Consumer Privacy Act (CCPA), the CPA adopts a controller/processor approach in the style of Washington’s proposed Washington Privacy Act (WPA), instead of the CCPA’s business/service provider framework. The new law is said to look very familiar next to this year’s Consumer Data Protection Act (CDPA) in Virginia, with a slightly broader scope. 

 

The Colorado Privacy Act applies to businesses trading with Colorado residents who act only in an individual or household context. 

 

The CPA applies to any data controller that conducts business in Colorado, or delivers commercial products or services targeted at Colorado residents, and meets either of the following thresholds:

  • The business controls or processes the personal data of at least 100,000 consumers during a single calendar year.
  • The business derives revenue or receives a discount from the sale of personal data, and processes or controls the personal data of at least 25,000 consumers.

 

According to the CPA, “consumer” refers to a Colorado resident acting only as an individual or in a household context. This excludes individuals acting in a commercial or employment context, as a job applicant, or as a beneficiary thereof. As with the CDPA, controllers operating under the CPA do not need to treat employee personal data as covered by this law.
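
The two applicability thresholds above can be expressed as a simple predicate. The sketch below is purely illustrative and not legal advice: the `BusinessProfile` type and its field names are hypothetical, invented here to mirror the statutory conditions described in this article.

```python
from dataclasses import dataclass

@dataclass
class BusinessProfile:
    """Hypothetical summary of the facts relevant to CPA applicability."""
    operates_in_colorado: bool         # conducts business in CO, or targets products at CO residents
    consumers_per_year: int            # CO consumers whose personal data is controlled or processed
    sells_personal_data: bool          # derives revenue or a discount from the sale of personal data

def cpa_applies(b: BusinessProfile) -> bool:
    """Either threshold triggers applicability, but only for businesses
    operating in or targeting Colorado."""
    if not b.operates_in_colorado:
        return False
    meets_volume_threshold = b.consumers_per_year >= 100_000
    meets_sale_threshold = b.sells_personal_data and b.consumers_per_year >= 25_000
    return meets_volume_threshold or meets_sale_threshold
```

For example, a Colorado-facing business that sells personal data and processes data for 30,000 consumers would fall under the CPA via the second threshold, even though it misses the 100,000-consumer volume threshold.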

The CPA applies to the exchange of personal data for monetary or other valuable consideration by a controller to a third party. 

Under the CPA, both monetary consideration and any other valuable consideration exchanged for personal data constitute a sale of personal information; unlike under the CDPA, a sale is not defined solely by the exchange of monetary consideration. The definition excludes several types of disclosure: disclosures to a processor that processes personal data on behalf of the controller, disclosures to a third party for the purpose of providing a product or service requested by the consumer, disclosures to an affiliate of the controller, and disclosures to a third party as part of a proposed or actual merger, acquisition, bankruptcy or other transaction in which the third party assumes control of some or all of the controller’s assets. 
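
The "sale" definition and its exclusions can be sketched as a small classifier. Again, this is an illustrative sketch only; the `DisclosureType` enum and its members are hypothetical names chosen to match the exclusions listed in this article.

```python
from enum import Enum, auto

class DisclosureType(Enum):
    TO_PROCESSOR = auto()           # processor acting on the controller's behalf
    REQUESTED_SERVICE = auto()      # third party providing a product/service the consumer requested
    TO_AFFILIATE = auto()           # affiliate of the controller
    MERGER_OR_ACQUISITION = auto()  # third party assuming control of the controller's assets
    OTHER_THIRD_PARTY = auto()      # any other disclosure to a third party

# Disclosure types excluded from the CPA's definition of "sale", per the text above.
EXCLUDED_FROM_SALE = {
    DisclosureType.TO_PROCESSOR,
    DisclosureType.REQUESTED_SERVICE,
    DisclosureType.TO_AFFILIATE,
    DisclosureType.MERGER_OR_ACQUISITION,
}

def is_sale(disclosure: DisclosureType, for_consideration: bool) -> bool:
    """A disclosure is a 'sale' when personal data is exchanged for monetary
    or other valuable consideration and no exclusion applies."""
    return for_consideration and disclosure not in EXCLUDED_FROM_SALE
```

Note that `for_consideration` covers both monetary and other valuable consideration, reflecting the CPA's broader-than-CDPA definition.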

Deidentified data and publicly available information are not covered by the scope of the CPA’s definition of personal data. 

The CPA does not cover any publicly available information or deidentified data. The CPA defines publicly available data as “any information that is lawfully made available from … government records and information that a controller has a reasonable basis to believe the consumer has lawfully made available to the general public.” Both are explicitly excluded from the CPA, as is the case with the CDPA. Other exemptions under this law fall into two categories: entity-level exemptions and data-level exemptions. Entity-level exemptions are broader, exempting controllers from CPA obligations and rights even for data that would otherwise be covered. For example, the primary entity-level exemption under the CPA applies to entities that are already regulated by the Gramm-Leach-Bliley Act for financial institutions. 

 

The Colorado Privacy Act provides five main rights to the consumer. 

The CPA provides consumers with five main rights: the right of access, the right to correction, the right to deletion, the right to data portability, and the right to opt out. The right of access lets consumers confirm whether a controller is processing personal data concerning them and obtain access to that data. Consumers are also given the right to correct inaccuracies in their personal data, taking into account the nature of the data and the purpose of the processing, and the right to delete their personal data. Under the right to data portability, consumers must be able to obtain their personal data in a portable and readily usable format that allows them to transmit it to another entity without hindrance, where technically feasible. Finally, the CPA gives consumers the right to opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that may produce legal or similarly significant effects concerning them.

 

There are several obligations to be fulfilled by controllers and processors under the CPA. 

 

The CPA imposes several obligations on controllers, including duties of transparency, purpose specification, data minimization, care, avoidance of secondary use, avoidance of unlawful discrimination, data protection assessments, data processing contracts, and specific duties regarding sensitive data. A controller must provide consumers with a reasonably accessible, clear and meaningful privacy notice. If consumers’ data is sold to a third party or processed for targeted advertising, the controller must clearly and conspicuously disclose the sale or processing, and give consumers the means to opt out. Controllers must specify the express purposes for which they are collecting and processing personal data at the time of collection. The CPA also institutes a policy of data minimization, requiring controllers to collect only personal data that is adequate, relevant, and limited to what is reasonably necessary for the specified purposes of the collection and processing. In addition, controllers may not process personal data for purposes that are not reasonably necessary to, or compatible with, the specified purposes for which it was collected, nor may they process sensitive data without consent. Finally, data protection assessments and data processing contracts are a necessary part of a controller’s obligations under the CPA, which requires that processing be governed by a contract between the controller and the processor.

 

Does your company have all of the mandated safeguards in place to ensure compliance with the CCPA, CPA, GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides GDPR, Data Protection Act 2018 and comparative law consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.