Critical choices like these often require personal data as their input, but there is a risk that organisations are failing to consider existing data regulations – and the incoming GDPR – when developing the algorithms and automated services that work behind the scenes in popular apps and websites. It is almost certain that such processing and manipulation happens in day-to-day life, and as data is increasingly handled through automated processes – including tasks where AI is used – machines are performing tasks and making decisions within parameters that vary hugely between organisations.

The problem was underlined by recent calls from a research team at the Alan Turing Institute for an independent watchdog to investigate AI decisions, so that those who have been discriminated against can challenge them. The Article 29 Working Party has recently issued guidance in which it recognises the benefits of automated decision-making and profiling, but highlights the significant risks to individuals if safeguards are not in place.

While existing data regulations contain provisions on profiling, and the GDPR makes only minor amendments – clarifying that the explicit consent of the data subject is a valid basis for automated profiling provided appropriate safeguards are in place – the regulation provides a timely reminder of how companies need to approach the complexity of data and artificial intelligence.

The potential pitfalls were illustrated by a recent example involving a major British media company. The company was developing an application, targeted at parents, that would determine their viewing recommendations at around 3pm. It was also designed to measure what proportion of parents would be watching children's programmes when they came home from school with their children.
But the firm, which has taken a proactive approach to GDPR and data handling, realised this would mean the algorithm potentially capturing children's data. It changed the code to make sure that children's data did not feed into the mix of recommendations.

The company was right to be cautious. Incoming data regulations provide that children merit specific protection, particularly where processing activities involve marketing to children or creating personality or user profiles of them.

The regulation is also firm on automated decisions and a lack of human intervention. Echoing existing data protection laws, it restates data subjects' right not to be subject to decisions that significantly affect them made solely on the basis of automated processing, including profiling. Existing laws also established that where profiling is used, there must be appropriate safeguards in place to protect the data subject's interests, rights and freedoms – these include the right for the data subject to obtain human intervention, to express his or her view and to contest the decision. Appropriate mathematical or statistical procedures should be used for profiling, and meaningful information provided about the logic involved, its significance and the envisaged consequences, in order to meet the requirement for transparency.

This underlines that, especially where AI and automated decisions are concerned, it is imperative that companies understand the need to adhere to robust data regulations by design and by default. Such principles will prove crucial in the future, as consumers will have much more power to drive business to or away from an organisation depending on how it handles their data. The power shift has been likened to the effect of market disruptions such as the "TripAdvisor effect" on the leisure and hospitality sector.
In the TripAdvisor scenario, the hotels and travel companies that embraced the reviews system, responded to the comments and – most importantly – made changes are the ones that have succeeded. The same will apply under the GDPR: the organisations that embrace the principles of accountability and transparency will thrive, and those that do not will struggle.

Organisations can start to close the gaps in their preparation now – and those who prepare by organising their data and considering their supply chain and relationships with data subjects will be in the strongest position. AI can be a complex and uncharted area, but getting GDPR right is not an insurmountable task.

Sarah Williamson is a partner in the commercial and technology team at Boyes Turner