Misconfigured Microsoft Power Apps have exposed millions of records, including health-related data.
Over a thousand misconfigured web apps have resulted in an estimated 38 million records being exposed online. While there is no evidence that the exposed records were accessed by anyone, investigative research revealed that these records, which included a great deal of sensitive personal data, were readily accessible online.
Researchers discovered that the default settings for Power Apps were making data publicly accessible.
Researchers at the security firm UpGuard found one misconfigured app while enabling APIs, and noticed that the settings defaulted to making the data publicly accessible. Upon further inspection, they discovered that thousands more apps were similarly misconfigured, leaving the personal records of millions of data subjects available online. These records included phone numbers, home addresses, social security numbers and even COVID-19 vaccination status. The misconfiguration affected several large companies and organizations, a testament to the far-reaching consequences of this kind of incident. Although there is no evidence that these records were accessed by unauthorized persons, the situation underscores the importance of verifying that privacy settings are as they should be, particularly with regard to cloud storage apps.
Misconfiguration is a common issue with cloud-based platforms, and many major providers have taken steps to secure customer data by default.
The exposed data was all stored in Microsoft’s Power Apps portals service, a cloud-based development platform that makes it easy to create web or mobile apps for external use. Misconfiguration is a common issue with cloud-based platforms. Major cloud providers like Amazon Web Services, Google Cloud Platform, and Microsoft Azure have all taken steps to ensure that customers’ data is stored privately by default and to flag potential misconfigurations, but until fairly recently the industry as a whole did not necessarily prioritize this issue.
Once informed of the misconfiguration issue on its platform, Microsoft took immediate steps to correct it and to change its default settings.
UpGuard, the organization which discovered the misconfigured settings, immediately took action. Its researchers documented the extent of the exposures and notified as many affected organizations as possible, though given the sheer number of entities involved, they could not reach every one. They also disclosed their findings to Microsoft, which immediately took steps to correct the issue. Earlier this month, Microsoft announced that Power Apps portals will now default to storing API data and other information privately. The company has also released a tool that customers can use to check their own portal settings.
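As an illustration of the underlying issue: Power Apps portals can expose list data through an OData feed, and when anonymous access is left enabled, an unauthenticated HTTP request to that feed returns records. Below is a minimal sketch of such a check; the portal URL and list name are hypothetical examples, not real endpoints, and the `/_odata/` path layout is an assumption based on how these feeds are commonly exposed:

```python
import urllib.request
import urllib.error

def odata_feed_url(portal_url: str, list_name: str) -> str:
    """Build the OData feed URL a portal would expose for a given list."""
    return f"{portal_url.rstrip('/')}/_odata/{list_name}"

def feed_is_publicly_readable(portal_url: str, list_name: str) -> bool:
    """Return True if an unauthenticated request to the feed succeeds."""
    req = urllib.request.Request(
        odata_feed_url(portal_url, list_name),
        headers={"Accept": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        # 401/403/404 or an unreachable host: not anonymously readable
        return False

# Hypothetical portal; a 200 response with record data would indicate exposure.
url = odata_feed_url("https://contoso.powerappsportals.com", "contacts")
```

The checking tool Microsoft released performs this kind of verification on the customer's behalf, flagging lists whose data is reachable without authentication.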
Does your company utilize or offer cloud-based storage? Does your company have all of the mandated safeguards in place to ensure compliance with the GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.
A Vienna based company incurred a GDPR fine of €2 million for the unlawful collection and processing of user data.
The Vienna based company was found to have violated several GDPR guidelines.
Unser Ö-Bonus Club GmbH was found to have committed a number of violations, including unlawful collection of user data, insufficient acquisition of consent, unlawful processing of personal data to profile consumers, and continuation of the violations after admitting to them. The violations concern Articles 6, 7, 12, and 13 of the GDPR. Under the GDPR, businesses may process personal data only if the processing and its purposes are lawful. Companies that collect personal data on the basis of consent must also be able to demonstrate, whenever required, that they obtained consent for the specific purposes for which the data was collected. The GDPR further requires that notice be given at the point of data collection and that nothing be hidden from users with regard to their data.
The company incurred a heavier fine because it continued to use unlawfully collected data after admitting to the violations.
After admitting to the violations during the investigation, the company continued to handle the unlawfully collected data. Although it amended the form, it continued to use the personal data collected through the previous form, which had been deemed inadequate. The company blamed the Austrian Data Protection Authority for not informing it that continued use of that data was unethical and unlawful. The Authority, however, concluded that an additional fine would apply for that violation as well, bringing the total fine to €2 million.
The ICO has outlined the Children’s Code standards on use of their online data.
The ICO has recently shone a spotlight on the Children’s Code standards to give organizations greater clarity regarding their use of children’s data. For online businesses, there are a few important points to keep in mind. In all actions taken by organizations operating online, children’s best interests should be given prime consideration, in line with the United Nations Convention on the Rights of the Child (UNCRC). The Children’s Code standards include several guidelines on how best to collect and handle children’s data, as well as how to provide the best level of protection for children against abuse and other forms of exploitation.
The Children’s Code standards outline several measures which must be taken in order to protect children and their data.
Children should be kept safe from any commercial exploitation, including personalized features designed to drive revenue, personalized advertising, monetization of children’s data, and age-inappropriate or fraudulent products. Businesses should also abide by the standards set by the Committee of Advertising Practice. Children should always be protected from abuse when they interact with others. Organizations are expected to prevent all automated sharing of children’s data, to ensure that it does not end up in the hands of any exploitative individual or organization. High privacy settings should be used by default, and children should be informed about how their data is being used.
Children should remain protected from misinformation, while understanding and controlling the information they share with others.
Moreover, according to the UNCRC, children have the right to unbiased information so that their best interests remain protected, and they should be shielded from misinformation. The UNCRC also provides for children’s right to play, so children’s data may be used to improve child-friendly gameplay. However, children should have the freedom to join and leave groups of their own free will and without consequence, and they should be aided in understanding and controlling the information they share with others.
Organizations are required to remain up to date with guidance from the ICO with regards to dealing with children’s data.
All online organizations and services are prohibited from using children’s personal data in a way that is detrimental to their wellbeing. For this purpose, they are required to conform to all detrimental-use standards set out in the UK GDPR, industry codes, regulatory provisions, and government advertising rules, and to keep up with any updates to them. Before marketing, online businesses should ensure that they follow all guidelines issued by the ICO as well. Businesses are expected to implement the various codes of practice and relevant provisions which the ICO outlines, from time to time, in its blogs.
Companies who collect data from children should adhere to the principle of data minimisation and only use data in the ways in which it was intended.
When collecting data from children, online businesses are also expected to collect only the minimum amount of data needed to serve the purpose, and to retain it only for a limited time frame. Collection should be separated for each element of the service, and children should be empowered to decide which elements they want. Companies should avoid using the data for other purposes, particularly where it is used to personalize the user experience.
Observing these guidelines while handling children’s personal data is imperative to fulfilling basic obligations towards children under the law and avoiding any sanctions.
A recent €1.75 million fine imposed by France’s CNIL relates to two GDPR violations by SGAM AG2R LA MONDIALE.
A recent fine imposed by the CNIL on the mutual insurance group SGAM AG2R LA MONDIALE for GDPR violations will cost the company €1.75 million. The company was found to have kept customer personal data beyond the legal retention period allowed under the GDPR. In addition, customers contacted by the company by phone were not provided key information required under the GDPR. Along with the fine, the CNIL decided to make its decision public. As the CNIL has noted, the company has since taken measures to achieve compliance.
SGAM AG2R LA MONDIALE was found to have been keeping customer data years after customers had been out of contact with the company.
Following an inspection by the CNIL, the insurance group was found to have violated Article 5(1)(e) of the GDPR by failing to limit the retention period of customer data. No systems were in place to ensure that customer data was not kept beyond the maximum legal retention period, and as a result the company’s records contained data relating to almost 2,000 customers who had not been in contact with the company in three to five years. There was also a group of over 2 million customers whose personal data, including sensitive health and financial details, was kept beyond the legal retention period allowed after the end of a contract.
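The retention failure described above is essentially the absence of an automated review or purge step. Below is a minimal sketch of the kind of check that was missing, assuming a simple record structure with a last-contact date and a hypothetical three-year retention limit (actual legal periods depend on the contract type and applicable national rules):

```python
from datetime import date, timedelta

# Hypothetical retention limit for illustration only; the legally
# permitted period varies by contract type and jurisdiction.
RETENTION_LIMIT = timedelta(days=3 * 365)

def records_past_retention(records, today=None):
    """Return the records whose last contact exceeds the retention limit."""
    today = today or date.today()
    return [r for r in records if today - r["last_contact"] > RETENTION_LIMIT]

# Example data: customer 1 is years out of contact, customer 2 is recent.
customers = [
    {"id": 1, "last_contact": date(2016, 5, 1)},
    {"id": 2, "last_contact": date(2021, 3, 15)},
]
overdue = records_past_retention(customers, today=date(2021, 8, 1))
```

Records returned by such a check would be candidates for deletion or anonymization, which is the control the CNIL found lacking.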
The fine imposed by the CNIL also covered a violation of Articles 13 and 14 of the GDPR.
Articles 13 and 14 of the GDPR outline the information which must be provided to data subjects when personal data is collected from them (Article 13), and when personal data has been obtained from other sources (Article 14). SGAM AG2R LA MONDIALE employed a subcontractor to contact data subjects on its behalf. The investigation revealed that the information this subcontractor provided to data subjects did not include all the elements required under the GDPR. Data subjects were not given sufficient information regarding the processing of their personal data and their other rights. In addition, they were not given the option of accessing more comprehensive information, whether via email or by pressing a key on their phone.
A fine of €1.75 million has been imposed on the company as they take measures to achieve compliance.
The CNIL decided to impose a total fine of €1,750,000 on SGAM AG2R LA MONDIALE and to make the decision public. There is no indication that the mutual insurance group has contested the fine. The company has, in fact, taken measures to come into compliance with Articles 5(1)(e), 13 and 14 of the GDPR. The CNIL’s restricted committee has taken note of the compliance measures the company has adopted concerning the limitation of retention periods and the information provided to data subjects.