ABTL Blog Post – Written by Rochelle Inbakumar. Rochelle enters her second year of Law at City, University of London in September 2022 and has written on data protection concerns since the 2011 News of the World scandal.
This month marks the 11th anniversary of the News of the World scandal, which woke Britain up to the data breaches happening under its nose. In 2011, it emerged that journalists had been hacking into the voicemails of members of the public, right up to the royal family. This, however, was only the beginning of society’s reckoning with data protection, as the tech boom of the 2010s carried on where the News of the World left off.
There have certainly been significant strides since the events of 2011, most notably the Data Protection Act 2018, which sets out to secure the “fair, lawful and transparent use” of personal data. This spans various categories, with some, such as race, religious beliefs and sexual orientation, gaining further legal protection. Shifts in the right direction have followed, especially as Google now plans to phase out third-party cookies by late 2023. The individual now enjoys broader rights, such as access to the data held about them, a point reinforced in ZXC v Bloomberg LP. In that case, the Supreme Court held that ‘Z’ had a reasonable expectation of privacy in the fact that he was under criminal investigation, despite Bloomberg’s reporting on it. Citing Lord Sumption’s observation in an earlier case that “the protection of reputation is the primary function of the law of defamation”, the Court upheld that rhetoric. It appears that data laws will expand as far as protecting the individual, even where their data appears to be of public interest.
Nevertheless, that is not to say the floodgates will be opened for class actions, as the seminal decision in Lloyd v Google made clear. There, the Supreme Court found in favour of Google (and, by extension, Big Tech), holding that damages could not be awarded to every member of the class without proof of individual damage or distress; mere “loss of control” of data was not enough. Had the claim succeeded, Google’s potential liability could have exceeded £3bn. On the whole, however, the Supreme Court’s decisions in both cases have been accepted, suggesting that by handling each issue on a case-by-case basis the courts can deliver fair outcomes through data protection law.
Nonetheless, recent years have seen a surge in data protection claims before the British courts. The coronavirus pandemic drove a boom in online activity during lockdowns, and that unprecedented growth made it a challenge to keep people abreast of technological change. Even the government itself was susceptible to failures: in rolling out the Test and Trace programme, the Department of Health and Social Care suffered multiple data breaches, including “email mishaps and unredacted personal information”. In an interview with The Guardian, Jim Killock, Executive Director of the Open Rights Group (ORG), noted that “the reckless behaviour of this government in ignoring a vital and legally required safety step known as the data protection impact assessment (DPIA) has endangered public health”. Big Tech, too, has had to hold its hands up to mistakes too many times, sometimes with irretrievable damage. Earlier this year, the UK government set out plans to empower the Digital Markets Unit (DMU) to “clamp down on predatory practices”, including new rules to “give users more control of their data”. This follows the string of data breach controversies since 2011, such as the Cambridge Analytica scandal, in which it was revealed that Facebook had failed to “keep users’ personal information secure” against tactics such as data harvesting.
As we step into a new era of a data-driven world, it is up to society and the courts to develop their expectations of Big Data companies and to decide where to draw the line. All eyes are now on the upcoming Bill of Rights, currently at its second reading and set to replace the Human Rights Act, though it has yet to address any form of data protection as a fundamental right of the individual. Whilst governments can provide the legislation to keep companies in check, it is ultimately down to those we trust with our data not to abuse public confidence.