Is effective AI regulation possible?

On our YouTube channel this week, we are discussing how AI is currently regulated, the challenges it raises, and what the future could look like!

One of the main challenges of AI is avoiding discrimination in its application and results. This requires controls on the datasets used for training, ethical and general principles built into the design, and regular checks once the system is in use.

An additional issue legislators should tackle is liability: who should be held responsible when AI breaks the law? This is also linked to international law, as courts will have to determine not only who is responsible but also which jurisdiction and applicable law should govern.

For example, one could argue it should be the country where the AI was programmed, or the country where the device that includes the AI was manufactured. The stakes rise sharply when it comes to life-changing decisions, such as the choice of who should live or die in a car accident. How do you decide?

What’s more, critics are concerned with the issue of self-development or ‘deep learning’ when it comes to AI. What if we cannot control our AI? Should we impose some limits? If so, what limits?

And what about the issue of AI being used as a weapon? Suggestions to limit some very specific applications of AI seem to merit much closer examination and action. A major case in point is the development of autonomous weapons that employ AI to decide when to fire, how much force to apply, and on what targets.

In our video this week we answer the question, is effective AI regulation possible?

We would also love to hear what you think, share your thoughts with us!

If you need advice on your AI product, Aphaia offers both AI ethics and Data Protection Impact Assessments.

British Airways data breach fine set at £183m under GDPR

British Airways is facing a record fine of £183m over a data breach of its security systems.

The GDPR imposes stiff fines on data controllers and processors for non-compliance. Depending on the infringement, a company can be fined up to €10 million or 2% of its worldwide annual revenue for the prior financial year, or, for the most serious infringements, up to €20 million or 4% of that revenue, whichever amount is higher in each case. The total proposed BA fine of £183.39 million would be the biggest penalty ever issued by the ICO. It is the equivalent of 1.5% of BA’s global turnover for the financial year ending December 31.
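As a rough, purely illustrative sketch of the arithmetic behind those percentages (the turnover figure below is inferred from the reported 1.5%, not taken from BA’s published accounts, and € and £ are treated as interchangeable for simplicity):

```python
# Back-of-the-envelope arithmetic for the figures reported above.
# Illustrative only: the turnover is inferred from the ICO's stated
# percentage, not taken from BA's published accounts, and currency
# conversion between euros and pounds is ignored for simplicity.

proposed_fine_gbp = 183.39e6   # proposed ICO fine: £183.39 million
share_of_turnover = 0.015      # reported as roughly 1.5% of global turnover

implied_turnover_gbp = proposed_fine_gbp / share_of_turnover
print(f"Implied global turnover: £{implied_turnover_gbp / 1e9:.1f}bn")  # ~£12.2bn

# Higher GDPR fining tier: up to €20 million or 4% of worldwide annual
# turnover, whichever is greater.
theoretical_max_gbp = max(20e6, 0.04 * implied_turnover_gbp)
print(f"Theoretical ceiling at the 4% tier: £{theoretical_max_gbp / 1e6:.0f}m")  # ~£489m
```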

What happened?

The fine relates to the theft of customers’ personal and financial information between June 2018 and September 2018 from the website ba.com and the airline’s mobile app. The airline initially said around 380,000 payment cards had been compromised; however, the ICO said in a statement that the personal information of 500,000 customers had been affected.

The ICO said the incident took place after users of British Airways’ website were diverted to a fraudulent site. Through this false site, the attackers harvested the details of about 500,000 customers.

An ICO spokeswoman made clear that the figure was set out in an initial notice of a fine and that, at £183.39m, it would be the largest penalty the ICO has ever issued.

Information Commissioner Elizabeth Denham said: “People’s personal data is just that – personal. When an organisation fails to protect it from loss, damage or theft, it is more than an inconvenience.

“That’s why the law is clear – when you are entrusted with personal data, you must look after it. Those that don’t will face scrutiny from my office to check they have taken appropriate steps to protect fundamental privacy rights.”

What information was stolen?

According to the ICO, a variety of information was “compromised” by poor security arrangements at the company, including login, payment card and travel booking details, as well as name and address information.

BA initially said the information involved included only names, email addresses and credit card information such as card numbers, expiry dates and the three-digit CVV code found on the back of credit cards.

Data protection regulators in other European countries will also be able to make representations on the scale of the fine because of the impact on their citizens. The money raised will be divided between the data protection authorities across Europe, with the portion allocated to the ICO going to the Treasury.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

Greenwich University data breach

The ICO fined Greenwich University £120,000 for failing to prevent a serious data breach.

The breach disclosed the data of 19,500 students. It occurred via a microsite developed by an academic and a student in the University’s then-devolved Computing and Mathematics School to facilitate a training conference in 2004. The data, which was subsequently posted online, included names, addresses, dates of birth, phone numbers and signatures. Around 3,500 of the records also contained sensitive data such as information on extenuating circumstances, details of learning difficulties and staff sickness records.

Greenwich was the first university to receive a fine under the Data Protection Act. Notably, the site was neither closed down nor secured after the conference in 2004, and it was first compromised in 2013. In 2016, multiple attackers exploited the site’s vulnerability, allowing them to access other areas of the web server.

The university did not appeal against the ICO’s decision. University Secretary Peter Garrod said: “we acknowledge the ICO’s findings and apologise again to all those who may have been affected”. He added: “No organisation can say it will be immune to unauthorised access in the future, but we can say with confidence to our students, staff, alumni and other stakeholders, that our systems are far more robust than they were two years ago as a result of the changes we have made”.

The Commissioner found that the University did not have in place appropriate technical and organisational measures for ensuring, so far as possible, that such a security breach would not occur.

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.

EE fined by ICO for sending unlawful texts

EE has been fined £100,000 by the ICO for sending 2.5 million direct marketing messages to its customers without their consent. We explain what companies should watch out for when sending such messages.

The messages, sent in early 2018, encouraged customers to access and use the ‘My EE’ app to manage their account and to upgrade their phone; a second batch of messages was sent to customers who had not engaged with the first.

The telecoms company stated during the investigation that it had sent the text messages to provide service information, and that they were therefore not covered by electronic marketing law. However, the ICO ruled that the messages constituted direct marketing because they contained promotional material.

When can a company send direct marketing messages?

Andy White, ICO Director of Investigations, said: ”These were marketing messages which promoted the company’s products and services. The direct marketing guidance is clear: if a message that contains customer service information also includes promotional material to buy extra products for services, it is no longer a service message and electronic marketing rules apply. … EE Limited were aware of the law and should have known that they needed customers’ consent to send them in line with the direct marketing rules. … Companies should be aware that texts and emails providing service information which also includes a marketing or promotional element must comply with the relevant legislation or could face a fine up to £500,000.”

Dr Bostjan Makarovic, Aphaia Partner, comments: “Companies sending direct marketing emails or text messages must either obtain explicit consent or enable opt-out both at the point of gathering contact details and in each individual message they send. Moreover, if relying on opt-out, any marketing is restricted to similar goods and services that are offered by the same company.”

EE acknowledged and accepted the ICO’s findings and decision, and said it is working to improve its processes: “We’re committed to ensuring our customers are fully aware of their options throughout the life of their contract, and we apologise to the customers who received these messages.”

Do you require assistance with GDPR and Data Protection Act 2018 compliance? Aphaia provides both GDPR adaptation consultancy services, including data protection impact assessments, and Data Protection Officer outsourcing.