Findings, recommendations and actions from ICO investigation into data analytics in political campaigns
Information Commissioner Elizabeth Denham has today published a detailed update of her office’s investigation into the use of data analytics in political campaigns.
In March 2017, the ICO began looking into whether personal data had been misused by campaigns on both sides of the referendum on membership of the EU.
In May it launched an investigation that included political parties, data analytics companies and major social media platforms.
Today’s progress report gives details of some of the organisations and individuals under investigation, as well as enforcement actions so far.
This includes the ICO’s intention to fine Facebook a maximum of £500,000 for two breaches of the Data Protection Act 1998.
Facebook, along with Cambridge Analytica, has been the focus of the investigation since February, when evidence emerged that an app had been used to harvest the data of 50 million Facebook users across the world, a figure since revised to an estimated 87 million.
The ICO’s investigation concluded that Facebook contravened the law by failing to safeguard people’s information. It also found that the company failed to be transparent about how people’s data was harvested by others.
Facebook has a chance to respond to the Commissioner’s Notice of Intent, after which a final decision will be made.
Other regulatory action set out in the report comprises:
- warning letters to 11 political parties and notices compelling them to agree to audits of their data protection practices;
- an Enforcement Notice for SCL Elections Ltd to compel it to deal properly with a subject access request from Professor David Carroll;
- a criminal prosecution of SCL Elections Ltd for failing to comply with the ICO’s Enforcement Notice;
- an Enforcement Notice for Aggregate IQ to stop processing retained data belonging to UK citizens;
- a Notice of Intent to take regulatory action against data broker Emma’s Diary (Lifecycle Marketing (Mother and Baby) Ltd); and
- audits of the main credit reference companies and Cambridge University Psychometric Centre.
Information Commissioner Elizabeth Denham said:
“We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes.
“New technologies that use data analytics to micro-target people give campaign groups the ability to connect with individual voters. But this cannot be at the expense of transparency, fairness and compliance with the law.
“Fines and prosecutions punish the bad actors, but my real goal is to effect change and restore trust and confidence in our democratic system.”
A second, companion report, titled Democracy Disrupted? Personal information and political influence, sets out findings and recommendations arising from the 14-month investigation.
Among the ten recommendations is a call for the Government to introduce a statutory Code of Practice for the use of personal data in political campaigns.
Ms Denham has also called for an ethical pause to allow Government, Parliament, regulators, political parties, online platforms and the public to reflect on their responsibilities in the era of big data before there is a greater expansion in the use of new technologies.
“People cannot have control over their own data if they don’t know or understand how it is being used. That’s why greater and genuine transparency about the use of data analytics is vital.”
In addition, the ICO commissioned research from the Centre for the Analysis of Social Media at the independent thinktank DEMOS. Its report, also published today, examines current and emerging trends in how data is used in political campaigns, how use of technology is changing and how it may evolve in the next two to five years.
The investigation, one of the largest of its kind by a Data Protection Authority, remains ongoing. The 40-strong investigation team is pursuing active lines of enquiry and reviewing a considerable amount of material retrieved from servers and equipment.
The interim progress report has been produced to inform the work of the Digital, Culture, Media and Sport (DCMS) Select Committee’s inquiry into fake news.
The next phase of the ICO’s work is expected to be concluded by the end of October 2018.
Notes to Editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).
- The General Data Protection Regulation (GDPR) is a new data protection law which applies in the UK from 25 May 2018. Its provisions are included in the Data Protection Act 2018. The Act also includes measures related to wider data protection reforms in areas not covered by the GDPR, such as law enforcement and security. The UK’s decision to leave the EU will not affect the commencement of the GDPR.
- However, due to the timing of certain incidents in this investigation, civil monetary penalties have to be issued under the previous legislation, the Data Protection Act 1998. The maximum financial penalty in civil cases under former laws is £500,000.
- Under past and current law, the ICO can take action to change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
- Since 25 May 2018, the ICO has had the power to impose a civil monetary penalty (CMP) on a data controller of up to £17 million (€20 million) or 4% of global turnover.
- The GDPR and the DPA2018 give the ICO new, strengthened powers. Some of these, such as assessment notices, can be used in this investigation because they took effect as soon as the GDPR came into force.
- The data protection principles in the GDPR evolved from the original DPA, and set out the main responsibilities for organisations. Article 5 of the GDPR requires that personal data shall be:
- Processed lawfully, fairly and in a transparent manner in relation to individuals;
- Collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes;
- Adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed;
- Accurate and, where necessary, kept up to date;
- Kept in a form which permits identification of data subjects for no longer than is necessary; and
- Processed using appropriate technical or organisational measures in a manner that ensures appropriate security of the personal data.
Article 5(2) requires that “the controller shall be responsible for, and be able to demonstrate, compliance with the principles.”
- Civil Monetary Penalties (CMPs) under past and current law are subject to a right of appeal to the First-tier Tribunal (General Regulatory Chamber) against the imposition of the monetary penalty and/or the amount of the penalty specified in the monetary penalty notice.
- Any monetary penalty is paid into the Treasury’s Consolidated Fund and is not kept by the ICO.
- To report a concern to the ICO go to ico.org.uk/concerns.