Apple accused of potential improper data-sharing.
Earlier this month, American multinational technology company Apple came under scrutiny for its data-sharing practice of sending IP addresses from users of its Safari browser to Google and to China-based tech company Tencent.
Apple has since defended this practice, noting that it is part of Safari's Fraudulent Website Warning security feature, aimed at flagging websites known to be malicious. In an interview with iMore, Apple reportedly noted that “When the feature is enabled, Safari checks the website URL against lists of known websites and displays a warning if the URL the user is visiting is suspected of fraudulent conduct like phishing. To accomplish this task, Safari receives a list of websites known to be malicious from Google, and for devices with their region code set to mainland China, it receives a list from Tencent. The actual URL of a website you visit is never shared with a safe browsing provider and the feature can be turned off.”
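Apple's description corresponds broadly to the hash-prefix lookup scheme safe-browsing providers use, in which the browser compares a short hash prefix of the URL against a locally held list rather than sending the URL itself. As a rough illustrative sketch only (the function names, prefix length and sample URL below are assumptions for illustration, not Apple's or Google's actual API):

```python
import hashlib

# Illustrative sketch of a safe-browsing-style lookup. Real providers
# distribute lists of hash *prefixes* of known-malicious URLs, so the
# full URL is never shared with the provider.

def url_hash(url: str) -> bytes:
    """Full SHA-256 digest of a URL (simplified: no canonicalisation)."""
    return hashlib.sha256(url.encode("utf-8")).digest()

# Hypothetical provider-supplied list of 4-byte hash prefixes.
MALICIOUS_PREFIXES = {url_hash("http://phishing.example")[:4]}

def looks_suspicious(url: str) -> bool:
    # The browser checks only a short hash prefix against the local list;
    # a match would normally trigger a further (still hash-based) check.
    return url_hash(url)[:4] in MALICIOUS_PREFIXES

print(looks_suspicious("http://phishing.example"))  # True
print(looks_suspicious("https://example.com"))
```

Because only hash prefixes are compared locally, the provider learns at most that some URL matched a prefix, which is consistent with Apple's claim that the actual URL is never shared.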
It is of note that Apple’s Fraudulent Website Warning setting is enabled by default. As such, users would have to delve into their settings and toggle it off if they do not want their IP address forwarded to Google or Tencent when using the Safari browser. It is also reported that toggling this setting off would potentially render browsing sessions less secure.
Potential GDPR and CCPA implications?
Considering that IP addresses can reveal user locations and can also be used to profile users, they are deemed online identifiers and thus personal data under Recital 30 GDPR, which means that this feature would be subject to GDPR compliance.
The recent Cookies Consent ruling by the CJEU, explored in one of our recent blog posts, could also potentially affect the way Apple handles its default permission settings.
Moreover, with the California Consumer Privacy Act Regulations (CCPA Regulations), scheduled to take effect on January 1, 2020, introducing consumer rights related to third-party sharing for companies doing business with California residents, it is likely that Apple will also have to review this practice to ensure CCPA compliance.
Using WhatsApp blue ticks to sign contracts? A Court in Vigo (Galicia, Spain) has treated WhatsApp chats as a verbal contract between the parties.
WhatsApp conversations may amount to a legally binding contract between the parties. The ruling arose from unpaid rent: the landlords sued the tenant, and the Court accepted the WhatsApp messages as the valid contract governing the legal relationship between them. The Court took into account the fact that WhatsApp was the means the parties used to agree on all the terms of the rent and to share the relevant documents needed to formalise it.
WhatsApp messages as contract and evidence in Court
Article 1278 of Spanish Civil Code states that “contracts will be legally binding for the parties regardless of their verbal or written nature, as long as the essential elements for their validity are met [namely: consent, object and cause]”.
As for the use of WhatsApp messages as valid evidence in Court, some requirements nonetheless apply, such as the need for expert reports to verify the origin of the communication, the parties' identities and the integrity of the content. Some Courts have accepted as sufficient evidence the provision of a password allowing the Court to access the relevant accounts, access to the device itself, or acknowledgement by each of the parties of the existence and truthfulness of the conversation.
WhatsApp, smart contracts and blockchain
In the light of this ruling, one may wonder whether WhatsApp conversations could become one of the “blocks” of blockchain technology and form part of smart contracts in the future. To achieve this, all the messages would need to be ordered and accessible, perhaps with no time limit, for verification purposes. This hypothetical but plausible scenario would raise several privacy concerns, because WhatsApp messages may be deemed personal data, and thus the GDPR and other pieces of legislation, such as those concerning AI, may apply.
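The ordering and verification requirement described above is essentially what a hash chain provides: each entry's hash covers the previous one, so any edit or reordering of earlier messages is detectable. A minimal sketch, purely illustrative (the function and the sample messages are invented for this example, and this is not how WhatsApp actually stores chats):

```python
import hashlib

def chain(messages):
    """Link an ordered list of messages into a tamper-evident chain.

    Returns a list of (message, link_hash) pairs where each hash covers
    the message and the previous link, blockchain-style, so changing or
    reordering any earlier message changes every later hash.
    """
    prev = b""
    out = []
    for msg in messages:
        link = hashlib.sha256(prev + msg.encode("utf-8")).hexdigest()
        out.append((msg, link))
        prev = link.encode("utf-8")
    return out

original = chain(["Rent is 600 EUR/month", "Agreed, starting 1 May"])
tampered = chain(["Rent is 500 EUR/month", "Agreed, starting 1 May"])

# Editing the first message changes the final link hash as well.
print(original[-1][1] != tampered[-1][1])  # True
```

A court-facing system built this way could verify content integrity from the final hash alone, though, as noted, retaining the underlying messages indefinitely for verification is precisely where the privacy concerns arise.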
Driverless cars are on track to becoming a reality, but what of privacy and data protection? In this blog we explore self-driving cars and GDPR.
When you think of self-driving cars what comes to mind? I’m willing to go on a limb here and say that it’s not just Knight Rider’s KITT, the Batmobile or other cool movie screen concepts. After all, in today’s technological era, driverless cars are no longer confined to our favourite TV shows and movies.
Tesla already asserts that all of their new cars “come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future—through software updates designed to improve functionality over time.” Meanwhile, reports indicate that earlier this year Waymo—a self-driving technology development company—signed a deal to make self-driving cars for use in France and Japan.
This certainly gives new meaning to the phrase “the future is now.” Doesn’t it? Yet as exciting as it all is, there is an undeniable dark side which cannot be ignored—data protection concerns, privacy risks and ethical issues.
In one of our recent Youtube vlogs we explored some potential ethical concerns and it really gets one thinking.
Self-driving cars and GDPR
As for the data protection concerns and privacy risks, here’s the skinny. Self-driving cars—much like today’s connected cars—will rely heavily on data collection, analysis and sharing. Since this data will revolve around our individual lives, some will fall within the purview of personal and sensitive data. Understandably, considering the cloud connectedness of it all, hacking will also be a security risk. This all renders self-driving cars and their technology subject to compliance with the GDPR and the Data Protection Act 2018.
Driverless cars are also a good example of the use of IoT, Big Data and AI all together.
“While self-driving cars and GDPR is a concern itself due to its implications for privacy, one should also consider ethical and Telecoms issues,” comments Aphaia Partner Cristina Contero. “On one hand because of the implementation of AI and, on the other, because self-driving cars will be one of the interconnected devices in the smart network of IoT. This also means that they will be part of smart cities, so making the difference between personal data and non-personal data becomes essential too. It might be clear that data gathered from audio or GPS can be considered part of the self-driving cars and GDPR context, but this is not that clear when it comes to other data like weather or road-conditions information.”
The ICO’s updated Data Sharing Code will provide companies with practical guidelines about how to share personal data in compliance with data protection legislation.
In today’s highly digital, efficiency-focused era, data sharing undoubtedly plays a significant role. Indeed, major technological shifts in how organisations do business present persuasive arguments for the need for data sharing. Just as prevalent, however, are the related privacy concerns.
For public and private organizations alike, the balancing act of sharing data without compromising sensitive personal information is vital. Not to mention the need to ensure compliance with GDPR and the Data Protection Act 2018.
The good news is that the update to the ICO data sharing code of practice is well on its way to being finalised.
Prepared under section 121 of the Data Protection Act 2018, the updated ICO data sharing code—currently in draft—will serve as a practical guide for organisations about how to share personal data in compliance with data protection legislation.
As noted in the draft code summary, the code explains the law and provides good practice recommendations. As such, “following it along with other ICO guidance will help companies manage risks; meet high standards; clarify any misconceptions organisations may have about data sharing; and give confidence to share data appropriately and correctly.”
According to the ICO the code will also address many aspects of the new legislation including transparency, lawful bases for processing, the new accountability principle and the requirement to record processing activities.
It is also important to note that in accordance with section 127 of the DPA the ICO will take the code into account when considering whether organisations have complied with their data protection obligations in relation to data sharing. In particular, the Commissioner will take the code into account when considering questions of fairness, lawfulness, transparency and accountability under the GDPR or the DPA. The code can also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant.
Public consultation on the draft data sharing code was launched in July and came to an end on September 9th. The draft code is now expected to be approved by Parliament before it becomes a statutory code of practice.