Legal basis is required for audio surveillance, according to the Polish SA

The Polish SA says a legal basis is required for audio surveillance and has fined the Warsaw Centre for Intoxicated Persons for a lack thereof.

 

The Polish Supervisory Authority was recently informed that, between 2016 and 2021, the Warsaw Centre for Intoxicated Persons recorded sound throughout its facility via its installed surveillance system, without a legal basis to do so, as stated in this report by the EDPB. Because the sound was recorded over such a long period, the infringement potentially affected a large number of people. According to the Polish SA, recording the audio of people who are often in a state of intoxication, which hinders them from making conscious statements or exercising adequate control over the sounds they make, is excessive.

 

The controller explained that the processing of this data was necessary for compliance with a legal obligation to which it is subject.

 

When approached by the Polish SA, the controller was asked to indicate its legal basis for collecting personal data in the form of sound. The controller indicated that processing the data in this manner was necessary for compliance with a legal obligation to which it is subject. In addition to the statute of the Centre, the controller referenced regulations included in the Polish Act on Upbringing in Sobriety and Counteracting Alcoholism. Furthermore, the controller explained that its system records both video and audio, and that the purpose of the processing is, among other things, to exercise continuous surveillance over, and ensure the safety of, persons brought into the Centre to sober up. The controller also informed the Polish SA that the audio and video files are stored for 30 to 60 days, except in cases where a recording is secured for use as evidence in ongoing proceedings.

 

The Polish SA determined that the regulations governing the Centre’s activity did not allow it to collect personal data in the form of audio recordings.

 

In its decision, the Polish SA stated that the controller cannot rely on any of the grounds listed in Article 6(1) of the GDPR, in particular Article 6(1)(c), which covers processing that is necessary for compliance with a legal obligation to which the controller is subject. The Authority also concluded that none of the regulations governing the Centre’s activity would allow the Centre to record audio containing personal data as part of its surveillance.

 

A fine of €2,200 was imposed on the Warsaw Centre for Intoxicated Persons.

 

When determining the amount of the administrative fine, the Polish SA took into account the controller’s degree of cooperation with the Authority as a mitigating factor. In addition, upon receiving notice of the initiation of administrative proceedings, the controller ceased the processing until the decision was issued and erased all of the recorded data, showing consideration for the rights of data subjects. As a result, an administrative fine of €2,200 was imposed on the controller.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

Digital Markets Act and Digital Services Act officially approved in the EU

The Digital Markets Act and the Digital Services Act have been officially approved in the EU and are now being implemented.

 

EU lawmakers recently approved the Digital Markets Act (DMA) and the Digital Services Act (DSA), which aim to curb the unfair advantages enjoyed by tech giants such as Google, Amazon, Apple, Facebook and Microsoft. Companies may now face fines of up to 10% of annual global turnover for DMA violations and 6% for DSA violations. Lawmakers and EU states had agreed on a political deal concerning both rulebooks earlier this year; however, some details were still being ironed out.

 

The DMA and DSA will impose strict rules on the way large tech companies operate, in order to prevent abuses of power.

 

The DSA prohibits targeted advertising aimed at children or based on sensitive data such as religion, gender, race and political opinions. The DMA will require businesses to make their messaging services interoperable and to ensure that users have access to their data. Business users on these platforms will now be able to promote competing products and services and to close deals with customers off the platforms. Large tech companies will now be prohibited from giving their own services an unfair advantage over those of their rivals, or from preventing users from removing pre-installed software or apps. These two rules are expected to affect Google and Apple in particular. Dark patterns, tactics that mislead people into giving personal data to companies online (explained in a recent vlog by Aphaia), will also be prohibited.

 

A task force has been established by the European Commission for the implementation of the DMA and DSA.

 

The European Commission has set up a task force of about 80 officials who are expected to collaborate on the implementation of the DMA and DSA. In addition, regulators will establish a European Centre for Algorithmic Transparency to attract data and algorithm scientists to assist with enforcement. Andreas Schwab, the lawmaker tasked with steering the issue through the European Parliament, has suggested a larger task force to counter Big Tech’s monetary advantage and access to lawyers. Other organisations have also voiced concerns. Ursula Pachl, Deputy Director General of the European Consumer Organisation (BEUC), said in a statement: “We raised the alarm last week with other civil society groups that if the Commission does not hire the experts it needs to monitor Big Tech’s practices in the market, the legislation could be hamstrung by ineffective enforcement.” EU industry chief Thierry Breton has addressed enforcement concerns, saying that various teams will focus on particular issues, such as risk assessments, interoperability of messenger services and data access, during the implementation of these rules.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

AI regulatory sandbox: pilot program launched

The Government of Spain and the European Commission recently launched a pilot program for the AI regulatory sandbox. 

 

Last month, the Government of Spain and the European Commission presented a pilot of the first regulatory sandbox on Artificial Intelligence. The sandbox aims to bring together the competent authorities and the companies that develop AI to determine the best practices that will guide the implementation of the European Commission’s future AI Regulation (the Artificial Intelligence Act). A timeline of two years has been set for the implementation of this legislation. According to this report from the European Commission, the Spanish government’s initiative seeks to operationalise the requirements of the future AI regulation as well as other features, including conformity assessments and post-market activities.

 

The AI regulatory sandbox will create an opportunity for innovators and regulators to come together and collaborate.

 

This sandbox presents a way to connect innovators and regulators within a controlled environment to foster cooperation. The aim is to facilitate the development, testing and validation of innovative AI systems with a view to ensuring compliance with the requirements of the AI Regulation. While preparations for the AI Act are under way, this initiative is expected to clarify easy-to-follow, future-proof best practices and other guidelines. The results are expected to facilitate the implementation of the rules by companies, particularly SMEs and start-ups.

 

The results of the pilot will determine guidelines for the implementation and use of AI throughout the European Union.

 

Through this pilot experience, the obligations of AI system providers, and how to implement them, will be documented and systematised in the form of good practices and lessons-learnt implementation guidelines. These will also include monitoring and follow-up methods for the national supervisory authorities in charge of implementing the supervisory mechanisms established by the regulation. While this project is being financed by the Spanish government, it will remain open to other Member States and could potentially become a pan-European AI regulatory sandbox, which is expected to strengthen cooperation among all possible actors at the European level. Cooperation at EU level with other Member States will be pursued within the framework of the Expert Group on AI and Digitalisation of Businesses established by the European Commission.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.

GDPR-CARPA certification mechanism adopted by CNPD

Luxembourg has adopted the GDPR-CARPA certification mechanism, becoming the first country to introduce a certification mechanism under the GDPR.

 

The National Data Protection Commission of Luxembourg (CNPD) adopted its GDPR-CARPA (Certified Assurance-Report based Processing Activities) certification mechanism last month. It is the first certification mechanism under the GDPR to be adopted at national and international level. Companies and other organisations established in Luxembourg now have the opportunity to demonstrate that their data processing activities comply with the GDPR. The certification gives controllers and processors a means of demonstrating a high level of compliance with the regulation for the processing activities it covers. This GDPR certification mechanism does not certify an organisation as such, but rather specific processing operations.

 

The certification in personal data protection was developed with the help of professional auditors and was also reviewed by the EDPB.

 

The CNPD, as owner of this certification mechanism, will accredit the entities that will issue the GDPR certification. The accreditation criteria were developed by the CNPD following numerous exchanges with audit professionals since the GDPR came into effect in 2018. The accreditation is based on ISAE 3000 (audit), ISQC 1 (quality control of auditing organisations) and ISO 17065 (accreditation of certification bodies). The accreditation criteria highlight the work done by the certification entity and the professional auditors. After the CNPD released the first version of this certification mechanism, other European data protection authorities examined the criteria under the consistency mechanism, and the EDPB then issued a formal opinion on GDPR-CARPA. In general, the CNPD has been a driving force behind the progress made by the EDPB in the field of certification, acting as rapporteur for the adopted guidance or assisting the EDPB in issuing formal opinions on this novel subject.

 

The implementation of the GDPR-CARPA certification mechanism will help build trust in the processing of the personal data covered by this mechanism.

 

The implementation of a certification mechanism can help promote transparency and compliance with the GDPR. It can also reassure data subjects about the degree of protection offered by the products, services, processes or systems used or offered by the organisations that process their personal data. A unique feature of the CNPD certification mechanism is that it is based on an ISAE 3000 Type 2 report, with the auditor being formally responsible for the implementation of the control mechanism. This offers a guarantee of a high level of confidence, which is key to helping the relevant actors and data subjects build trust in the processing of any personal data covered by this certification scheme.

Does your company have all of the mandated safeguards in place to ensure the safety of the personal data you collect or process? Aphaia can help. Aphaia also provides both GDPR and Data Protection Act 2018 consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.