Colorado Privacy Act signed into law

The Colorado Privacy Act has been signed into law, making Colorado the third US state with a comprehensive privacy law. 


The Colorado Privacy Act has recently been signed into law, giving the residents of Colorado comprehensive privacy protections for the first time. Colorado is now the third US state to enact such a law, and the CPA closely resembles its predecessors, with a few key differences. Unlike the California Consumer Privacy Act (CCPA), the CPA adopts a GDPR-like controller/processor approach instead of a business/service provider model. The new law is said to look very familiar next to this year’s Consumer Data Protection Act (CDPA) in Virginia, albeit with a slightly broader scope. 


The Colorado Privacy Act is intended to apply to businesses dealing with Colorado residents who act only in an individual or household context. 


The CPA applies to any data controller that conducts business in Colorado, or delivers commercial products or services targeted at Colorado residents, and that meets either of the following thresholds:


  • The business controls or processes the personal data of at least 100,000 consumers during a single calendar year.
  • The business derives revenue, or receives a discount on the price of goods or services, from the sale of personal data, and processes or controls the personal data of at least 25,000 consumers.
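
As a rough illustration, the two thresholds above can be sketched as a simple check. This is a hypothetical sketch in Python; the function and parameter names are invented for illustration and are not part of the statute.

```python
def cpa_applies(consumers_processed_per_year: int,
                sells_personal_data: bool,
                consumers_sold: int) -> bool:
    """Return True if either CPA applicability threshold is met."""
    # Threshold 1: controls or processes the personal data of
    # at least 100,000 consumers in a calendar year.
    if consumers_processed_per_year >= 100_000:
        return True
    # Threshold 2: derives revenue (or a discount) from selling
    # personal data AND processes/controls the personal data of
    # at least 25,000 consumers.
    if sells_personal_data and consumers_sold >= 25_000:
        return True
    return False
```

For example, a business processing data of 50,000 consumers that does not sell personal data would fall outside both thresholds, while the same business would be covered if it sold the data of 25,000 of them.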


According to the CPA, a “consumer” is a Colorado resident acting only in an individual or household context. This excludes individuals acting in a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context. As under the CDPA, controllers operating under the CPA do not need to treat employee personal data as covered by this law.

The CPA applies to the exchange of personal data for monetary or other valuable consideration by a controller to a third party. 


Under the CPA, personal data exchanged for either monetary consideration or any other valuable consideration is considered sold; unlike the CDPA, a sale is not defined solely by the exchange of monetary consideration. Several types of disclosure are excluded from this definition: disclosures to a processor that processes personal data on behalf of the controller, disclosures to a third party for the purpose of providing a product or service requested by the consumer, disclosures to an affiliate of the controller, and disclosures to a third party as part of a proposed or actual merger, acquisition, bankruptcy or other transaction in which the third party assumes control of some or all of the controller’s assets. 
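
The definition and its exclusions can be pictured as a small decision rule. The sketch below is purely illustrative: the recipient categories, parameter names and string labels are invented to mirror the summary above, not drawn from the statute.

```python
# Recipient types that the CPA excludes from the definition of "sale"
# (hypothetical category labels for illustration).
EXCLUDED_RECIPIENTS = {
    "processor",      # processing personal data on the controller's behalf
    "affiliate",      # affiliate of the controller
    "merger_party",   # party to a merger, acquisition, bankruptcy, etc.
}

def is_sale(recipient_type: str,
            consideration: str,
            requested_by_consumer: bool = False) -> bool:
    """Return True if a disclosure would count as a 'sale' under the CPA."""
    # Disclosures made to provide a product or service the consumer
    # requested are excluded.
    if requested_by_consumer:
        return False
    if recipient_type in EXCLUDED_RECIPIENTS:
        return False
    # Both monetary and other valuable consideration qualify.
    return consideration in {"monetary", "other_valuable"}
```

Note how, unlike under the CDPA, the final line treats non-monetary ("other valuable") consideration as a sale too.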

Deidentified data and publicly available information are not covered by the scope of the CPA’s definition of personal data. 


The CPA does not cover publicly available information or deidentified data. It defines publicly available information as “any information that is lawfully made available from … government records and information that a controller has a reasonable basis to believe the consumer has lawfully made available to the general public.” Both are explicitly excluded from the CPA, as is the case with the CDPA. Other exempt data falls under two categories: entity-level exemptions and data-level exemptions. Entity-level exemptions are broader, relieving controllers of the need to comply with CPA obligations and rights for the data they collect, even when that data would otherwise be covered. For example, the primary entity-level exemption under the CPA applies to financial institutions already regulated by the Gramm-Leach-Bliley Act. 


The Colorado Privacy Act provides five main rights to the consumer. 

The CPA provides five main rights for the consumer: the right of access, the right to correction, the right to deletion, the right to data portability, and the right to opt out. The right of access allows consumers to confirm whether a controller is processing personal data concerning them and to access that personal data. Consumers are also given the right to correct inaccuracies in their personal data, taking into account the nature of the data and the purpose of the processing, as well as the right to delete their personal data. Under the right to data portability, consumers must be able to obtain their personal data in a portable and readily usable format that allows them to transmit it to another entity without hindrance, where technically feasible. Finally, the CPA gives consumers the right to opt out of the processing of their personal data for the purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that may produce legal or similarly significant effects concerning them.


There are several obligations to be fulfilled by controllers and processors under the CPA. 


The CPA imposes several obligations on controllers: duties of transparency, purpose specification, data minimization, care, avoidance of secondary use, avoidance of unlawful discrimination, data protection assessments, data processing contracts, and specific duties regarding sensitive data. A controller must provide consumers with a reasonably accessible, clear and meaningful privacy notice. If data is sold to a third party or processed for targeted advertising, the controller must clearly and conspicuously disclose the sale or processing and give consumers the means to opt out. Controllers must specify the express purposes for which they are collecting and processing personal data at the time of collection. The CPA also institutes a policy of data minimization, requiring controllers to collect only personal data that is adequate, relevant and limited to what is reasonably necessary for the specified purposes. In addition, controllers may not process personal data for purposes that are not reasonably necessary to, or compatible with, the specified purposes for which it was collected, nor may they process sensitive data without consent. Data protection assessments and contracts are also a necessary part of a controller’s obligations: the CPA requires that processing be governed by a contract between the controller and the processor.


Does your company have all of the mandated safeguards in place to ensure compliance with the CCPA, CPA, GDPR, Law Enforcement Directive and Data Protection Act 2018? Aphaia provides GDPR, Data Protection Act 2018 and comparative law consultancy services, including data protection impact assessments and Data Protection Officer outsourcing. We can help your company get on track towards full compliance. Contact us today.


A recent CCPA update

A recent CCPA update: two amendments signed into law.

A recent CCPA update includes two amendments signed into law, with further proposed modifications possible.


The California Consumer Privacy Act (CCPA) is changing, according to a recent article from the IAPP. In September, the California Attorney General, Xavier Becerra, highlighted the need for privacy law in the United States. His testimony was presented during the Senate Commerce, Science and Transportation Committee hearing, in the context of proposed federal legislation on the subject.


The Attorney General underscored the different approaches his office can take to enforcing the CCPA, as well as the future that privacy issues should have in law. In addition to these statements, the California Legislature had already approved several bills with privacy implications prior to this update.


Becerra confirmed that since July 1 (the CCPA’s enforcement date), his office “began to work.” The Attorney General’s team is issuing notices to stop companies’ privacy policies from breaking the law; in other words, to correct companies’ handling of the required “Do Not Sell My Personal Information” links. 


In his statements, the Attorney General mentioned the recent lawsuits against Uber (2018), Equifax (2019), and Glow (September 2020). His office also recently settled a case with Anthem regarding a 2014 data breach for $8.69 million. 


What changes were made to the CCPA?      


Statements in the written testimony are not limited to a private right of action. The Attorney General identified other ways to strengthen consumer privacy rights, noting that the CCPA “could go further.” Among the proposals are:


Greater specificity

The Attorney General suggested making the CCPA’s disclosure requirements more specific. Currently, companies need only disclose the “categories of sources” from which they collect personal information or the “categories of third parties” to which they sell it. By requiring specific disclosures instead, such as the names of the companies, sources or recipients of the information, consumers could know exactly how much was shared and with whom.  


Data minimization

Becerra maintains that companies should be under a duty to use a consumer’s personal information only in line with the purposes for which it was collected, always respecting the interests of the person, especially with sensitive information such as precise geolocation. 


Right to rectification

A highlight of the CCPA update is the ability for consumers to correct the personal information collected about them, reducing the risk of erroneous data spreading. 


Protection of civil rights

There is a need for “clear lines on what is an illegal use of data from the context of the protection of civil rights”. What is important here is that the testimony provides relevant insight into the Attorney General’s perspective, specifically on expanding privacy protections for California consumers.


What does the future hold?


The California Privacy Rights Act (CPRA), a pending ballot initiative, could boost enforcement. Tentatively set for a vote in November, it would create a new enforcement agency, the California Privacy Protection Agency, funded with $5 million in the 2020-21 fiscal year and $10 million thereafter. The creation and funding of the agency would go into effect immediately, but most of the CPRA’s provisions would not begin until 2023. 


For now, the best course is to watch how the landscape of California privacy law progresses, including CCPA enforcement activity, the CPRA ballot initiative, and the third set of proposed modifications to the CCPA regulations issued by the OAG.


Aphaia can help you comply with CCPA. We offer CCPA implementation as a stand-alone service or together with GDPR, plus other related services such as data protection impact assessments and Data Protection Officer outsourcing.

Age Appropriate Design Code

Age Appropriate Design Code will come into force in less than a month.

Age Appropriate Design Code will come into force on September 2nd 2020, followed by a 12-month transition period allowing online services time to conform.

The Age Appropriate Design Code, which we initially reported on back in January when the final version was first introduced, has now completed the parliamentary process and was recently issued by the ICO, coming into force on 2nd September 2020. This code of practice for online services finalises the 15 standards laid before Parliament in January of this year. Under section 123(1) of the Data Protection Act 2018, the Information Commissioner was required to prepare a code containing guidance on standards of age appropriate design for relevant information society services which are likely to be accessed by children. 

The Age Appropriate Design Code is a statutory code of practice, providing built in protection for children online.

This code is the first of its kind and is considered by the Information Commissioner to be necessary, achievable and likely to make a difference. The Commissioner believes that companies will want to conform to the standards in order to demonstrate their commitment to always acting in the best interests of the child. Although not intended to replace parental controls, the code should increase confidence in children’s safety as they use the internet. Its 15 standards are flexible: they are not laws, but a statutory code of practice that provides built-in protection for children spending time online, ensuring that their best interests are the primary consideration when online services are designed and developed. 

The Code lays out 15 Standards, ensuring children’s best interest.

  • The best interests of the child;

The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.

  • Data protection impact assessments;

 Undertake a DPIA to assess and mitigate risks to the rights and freedoms of children who are likely to access your service, which arise from your data processing. Take into account differing ages, capacities and development needs and ensure that your DPIA builds in compliance with this code.

  • Age appropriate application;

 Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.

  • Transparency;

The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.

  • Detrimental use of data;

Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.

  • Policies and community standards;

 Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).

  • Default settings;

 Settings must be ‘high privacy’ by default (unless you can demonstrate a convincing reason for a different default setting, taking account of the best interests of the child).

  • Data minimisation;

 Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.

  • Data sharing;

 Do not disclose children’s data unless you can demonstrate a convincing reason to do so, taking account of the best interests of the child.

  • Geolocation;

Geolocation options should be off by default (unless you can demonstrate a convincing reason for geolocation to be switched on by default, taking account of the best interests of the child). Provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to ‘off’ at the end of each session.

  • Parental controls;

If you provide parental controls, the child should be given age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, you should provide an obvious sign to the child when they are being monitored.

  • Profiling;

Switch all options which use profiling ‘off’ by default (unless you can demonstrate a convincing reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if there are appropriate measures in place to protect the child from any harmful effects (particularly, content that is detrimental to their health or wellbeing).

  • Nudge techniques

 There should be no use of nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections.

  • Connected toys and devices

 If your company provides a connected toy or device, you should ensure that you include effective tools to enable conformance to this code.

  • Online tools. 

 Provide prominent and accessible tools which will help children exercise their data protection rights and report concerns.
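
Several of the standards above (default settings, geolocation and profiling) describe a common pattern: settings affecting a child's privacy should default to the most protective option. The sketch below illustrates that pattern only; the class and field names are invented for illustration and are not part of the code itself.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    """Illustrative 'high privacy by default' settings for a child user."""
    high_privacy: bool = True            # Default settings: 'high privacy' on
    geolocation_enabled: bool = False    # Geolocation: off by default
    location_visible_to_others: bool = False
    profiling_enabled: bool = False      # Profiling: off by default

    def end_session(self) -> None:
        # Geolocation standard: options making a child's location visible
        # to others must revert to 'off' at the end of each session.
        self.location_visible_to_others = False
```

Because every privacy-relevant field defaults to the protective value, a child's account starts in the safest configuration, and any less protective choice must be made actively during a session rather than persisting silently.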

The code will apply to any product or service likely to be accessed by children, and not just those aimed at children.

The standards laid out in this code apply to any company or institution providing products or services (including apps, programmes, websites, games or community environments, and connected toys or devices with or without a screen) which are likely to be accessed by children, not just those aimed at children, and which process personal data in the UK. Due to increasing concern about the position of children in the modern digital world and in wider society, the general consensus in the UK and internationally is that more needs to be done to create a safe space for them to learn, explore and play online. The purpose of this code is not to protect children from the digital world but to protect them within that space. The code takes account of the standards and principles set out in the UNCRC, and sets out specific protections for children’s personal data in compliance with the GDPR.

This code, which comes into effect next month, must support children’s rights.

This code comes into effect on September 2nd, 2020, as announced by the ICO this week. That date begins the 12-month transition period, during which companies are expected to take steps towards full compliance, ensuring that all standards are considered and that their services use children’s data in ways that support the following rights of the child:

  • Freedom of expression.
  • Freedom of thought, conscience and religion.
  • Freedom of association.
  • Privacy.
  • Access information from the media (with appropriate protection from information and material injurious to their well-being).
  • Play and engage in recreational activities appropriate to their age.
  • Protection from economic, sexual or other forms of exploitation. 

Failure to conform to these standards could result in assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines). As a result, data protection impact assessments are suggested to ensure compliance.

Does your company offer online services likely to be accessed by minors? If so, it will be imperative that you adhere to the Age Appropriate Design Code once it comes into effect. Aphaia’s data protection impact assessments and Data Protection Officer outsourcing can assist you in ensuring compliance. Aphaia provides GDPR adaptation consultancy services and CCPA compliance, including EU AI Ethics assessments. Contact us today.

Biometric Information Privacy Act

The National Biometric Information Privacy Act introduced by US Senator Jeff Merkley.

The National Biometric Information Privacy Act of 2020 was introduced last week by US Senator Jeff Merkley, together with Senator Bernie Sanders.


The National Biometric Information Privacy Act, groundbreaking legislation prohibiting private companies from collecting and profiting off biometric data without consumers’ and employees’ consent, was recently introduced by US Senator Jeff Merkley. In this context, biometric data includes eye scans, fingerprints, voiceprints, faceprints and any other uniquely identifying information based on the characteristics of an individual’s gait or other immutable features. The bill was introduced in response to increased concern about the widespread collection of biometric data by private companies and the implications of its use. As an example of the effects of deploying this technology without the relevant ethical principles in place, a study released in December 2019 found that Asian and Black individuals were up to 100 times more likely to be misidentified by facial recognition technology. This has raised considerable concern about the consequences of using this technology for store surveillance.


This bill limits companies’ use of individuals’ biometric data without their consent.


This bill aims to limit a company’s ability to collect, buy, sell, lease, trade or retain individuals’ biometric information without specific written consent, and would also require private companies to disclose to individuals, on request, the information collected about them. If the bill is passed, any company that fails to comply could face lawsuits from the individuals whose rights were violated. 


Senator Jeff Merkley has been instrumental in establishing privacy legislation. 


Prior to introducing the National Biometric Information Privacy Act, Senator Jeff Merkley had been a champion of privacy rights, racial justice and the responsible use of emerging technologies. He played a role in introducing the Facial Recognition and Biometric Technology Act this June and, along with Senator Cory Booker, introduced the Ethical Use of Facial Recognition Act, which we reported on earlier this year and which proposed a federal moratorium on the use of facial recognition. He has also, in the past, pressed the CEO of CLEAR, a biometric identification company, for information on its privacy practices and precautions regarding a new product being marketed to companies for screening employees and customers for coronavirus. In addition, following reports that cars can collect up to 25GB of data per hour, he called upon car manufacturers to report to Congress on whether their cars collect personal data from drivers. 


The National Biometric Information Privacy Act is heavily supported by key lawmakers and various groups.


The introduction of this new bill is one of many efforts by the senator to combat what Senator Bernie Sanders has referred to as “‘big brother’ surveillance” and “Orwellian facial recognition”. The bill is supported by Fight for the Future, the American Civil Liberties Union, the Electronic Frontier Foundation and the Open Technology Institute. Neema Singh Guliani, Senior Legislative Counsel at the American Civil Liberties Union, said: “Biometric identifiers are uniquely sensitive pieces of information that can be used to track who we are and where we go. Importantly, this bill ensures that companies cannot collect and use these identifiers without strong privacy safeguards. It pairs these safeguards with strong enforcement, allowing consumers to take companies who violate these standards to court.” 


What does the future hold for biometric identification technology in the US, Europe and the world at large? While major developments continue in the capabilities and use of biometric identification technology, a significant amount of legislation has also been introduced to regulate its use in various parts of the world. The GDPR, for example, gives everyone the right to object to profiling, including biometric profiling. Furthermore, pursuant to Article 35 of the GDPR, “Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data”.


Does your company utilize biometric data such as fingerprinting, voice printing and facial recognition? If yes, failure to adhere fully to the guidelines and rules of the GDPR and Data Protection Act 2018 could result in a hefty financial penalty. Aphaia provides both GDPR adaptation consultancy services and CCPA compliance, including data protection impact assessments, EU AI Ethics assessments and Data Protection Officer outsourcing. Contact us today.