Profiling and automated decision-making under GDPR

Profiling and automated decision-making (ADM) are two areas of the GDPR that have caused a fair degree of confusion for businesses, often accompanied by a perception that the law significantly restricts most forms of computer-led analysis of data subjects and their activities. Not necessarily so.

As per the general flavour of the GDPR, the law has undoubtedly tightened, placing greater burdens and requirements on businesses wishing to carry out profiling or ADM activities. However, there’s still plenty of opportunity for those willing to understand the detail of the law and, more generally, to align their business models with the core themes of the GDPR.

What is profiling?

Profiling is the automated processing of personal data to evaluate certain things about an individual.

Typical applications of profiling include online behavioural advertising (such as targeted online ads based on browsing behaviour), credit scoring as part of a mortgage or finance application, and the use of artificial intelligence and machine learning (for example, in Internet of Things applications).

Profiling is all about evaluation and not decisions; it’s an important distinction. Profiling could form part of an automated decision-making activity, but on its own culminates in intelligence and opportunity and not computer-led decisions about an individual.

1. Comply with the transparency obligations of the GDPR

Profiling personal data is a processing activity and therefore caught by the transparency obligations under the GDPR, and your organisation’s privacy notice (or other means of notifying individuals) should set this out.

2. Have a lawful basis for processing (and it’s not all about consent!)

European Data Protection Board (EDPB) guidance suggests it’s unlikely organisations will be able to rely on performance of a contract as the lawful basis for processing, and therefore the two most common lawful bases are:

a. consent – although you will need to show that the individual knows what they are consenting to, so they can make an informed choice, and you will more generally need to meet the consent requirements of the GDPR (which is a relatively high bar).

b. legitimate interest – very much an option for many profiling activities. This will require a legitimate interests assessment to be conducted beforehand and particular thought needs to be given to the detail and comprehensiveness of the profile, the impact of the profiling and the safeguards in place to ensure fairness and non-discrimination. Make sure that your assessment is honest, and that the risk outcomes are realistic.

3. Take account of data subject rights

Individuals have the right to object to profiling under the GDPR and therefore this needs to be brought to their attention clearly and separately from other information. Your organisation should have a process in place to handle such objections, particularly where the objection relates to profiling for direct marketing, which is an absolute right.

What is automated decision-making?

You guessed it, a machine makes a decision about an individual. To be more precise, it’s a decision which must:

  • be conducted solely by automated means (i.e. no human intervention); and
  • have a legal or similarly significant effect on an individual.

The first limb is fairly straightforward: if there is meaningful human intervention (for example, a person considering the results of the automated process before applying them to an individual), then the activity will not qualify as automated decision-making. However, if a human merely inputs the data but the decision-making itself is automated, the activity could still be considered automated decision-making.

The second limb of the test is a bit more complicated. A “legal effect” is fairly easy to define, i.e. something which affects an individual’s legal status or rights (for example, entitlement to housing or disability benefits), but what constitutes a “similarly significant effect” is more nebulous.

There are obvious examples of a “similarly significant effect”, such as the automatic refusal of an online credit application or e-recruiting practices with no human intervention (such as using psychometric testing to filter out candidates). Guidance points to decisions that significantly affect the circumstances, behaviour or choices of individuals, have a prolonged or permanent impact and, at their most extreme, lead to the exclusion of or discrimination against individuals.

Read the full article here.

Over 2,400 courses from Top Institutions

Two weeks in and feeling those lockdown blues?

The only thing getting you up in the morning is to see whether Holly Willoughby has got the kit on and you’ve just added the bin men as your new Facebook friends?

Then look no further and see that we can beat this Coronavirus in more ways than one!

Take a look at these free online courses from top institutions such as Harvard, MIT, Berkeley and many more. Whether it’s Statistical Inference and Modelling for High Throughput Experiments (like I’m on), Aeronautical Engineering, Hypersonics, Health & Nutrition or the very topical Preparing for Disruption, there’s a course there for everybody! Most of the courses take around 5-7 weeks to complete and you can watch the lectures at your own pace.

Use this added time wisely and with any luck the pubs will never close again!

Click here for more details….

Analytics: Preparing for the next generation of players

Globally, online gambling is an estimated $20 billion a year industry.

An estimated $400 million is wagered annually in Ontario alone[i]. Considering that only a few years ago Canada was considered inexperienced in the iGaming scene, and that traditional gaming revenues remain stagnant (for the most part) while iGaming revenues soar, there is no denying that demand for online gaming is on the rise.

Continue reading….