By Julian King, in The Guardian, 28 July 2018, https://www.theguardian.com/commentisfree/2018/jul/28/democracy-threatened-malicious-technology-eu-fighting-back
Attacks on elections and electoral campaigns fall into two main categories: those based on systems and those based on behaviours. The first category includes cyber-attacks that manipulate the electoral process or voting technology to change the number of voters or the number of votes.
But the second category of threat is much more subtle and harmful – attempts at manipulating voting behaviour. In my view, this can take three forms: hacks and leaks designed to change public opinion by revealing damaging information at a crucial point during a campaign; the use of fake news to sway public opinion and influence results; and the misuse of targeted messaging based on psychometric profiles built from users’ mined personality data, as in the Cambridge Analytica case.
So how are we responding to these challenges?
In terms of the systems threat, we drew up with member states a set of common guidelines on how to secure the whole election lifecycle from cyber-attacks. This work, carried out by member states under the leadership of Estonia and the Czech Republic in the context of the NIS Cooperation Group, has resulted in a concrete set of recommendations and measures for national authorities to protect against “physical” cyber threats, ie the hacking of electronic tools, systems and databases used in the election process.
To counter behavioural threats, the European commission proposed a number of measures in April against disinformation and behavioural manipulation, including important steps that we expect the internet platforms to take to ensure that social media cannot be turned into a weapon against democracies.
We want to see genuine transparency, traceability and accountability online. Users should know who has created the content they are seeing, who might gain from it and why it is being shown to them.
We want platforms to step up their efforts to identify and delete fake accounts and establish clear rules around bots so that they cannot be passed off as human online. We want to make it easier for users to assess the trustworthiness of content, while also reducing the visibility of disinformation. And we would like to see greater clarity around how algorithms work.
Most pressingly, the measures include a code of practice to be adopted by internet platforms, which will require them to improve how adverts are placed, to restrict targeting options for political advertising and to reduce the revenues made by those behind disinformation. It will also promote greater transparency around sponsored content – marking it clearly as such and stating who has paid for it.
The code is being drawn up by representatives of the platforms, the advertising industry and advertisers. A first draft was presented to the commission earlier this month and it represents an important step forward, although it is not yet satisfactory and further work needs to be done quickly, with a view to implementing the code by September. A sounding board of fact-checkers, academics, media and civil society organisations is now assessing the draft code and identifying areas for improvement.
In addition, we are organising a series of events for member states and other stakeholders to share best practices on how to keep our democratic processes secure. In October, we will convene a high-level meeting, bringing together national players in order to take stock of progress on the various fronts and to identify and share best practices for election security. This will in turn feed into the annual Colloquium on Fundamental Rights, hosted by my colleague Frans Timmermans, which this year will focus on democracy in the EU.
Beyond our efforts, member states are taking their own measures at national level in the light of forthcoming elections, and there is strong transatlantic cooperation on this issue, discussed in the EU/US security and cyber dialogues and in the context of the Transatlantic Commission on Election Integrity.
We now need to step up this work and ensure that public authorities, as well as other actors – both public and private – are as prepared as possible. That means establishing plans at national level to guard against cyber-attacks and election interference.
To this end, we need every member state to assess comprehensively the threat to their democratic processes and institutions, whether from more traditional cyber-attacks or from the manipulation of information. Political parties themselves need to set an example: they could commit to certain standards of transparency and openness in their own online campaigns, such as the targeting of political messages via social media.
This is an ongoing and urgent issue, not least with the European elections coming up next May, and we are not resting on our laurels – we are constantly in the process of analysing the situation to see if we need to take any more action. Because while elections in Europe may have changed in their appearance over the past decade or so, the underlying need to ensure they are free, fair and without interference has not.
• Sir Julian King is European commissioner for the Security Union

The DCMS select committee’s interim report on its 18-month investigation into fake news and the use of data and “dark ads” in elections offers a wide-ranging, informed and sustained critique that carries with it the full weight of parliament. The verdict is withering: Facebook failed. It “obfuscated”, refused to investigate how its platform was abused by the Russian government until forced by pressure from Senate committees and, in the most damning section, aided and abetted the incitement of racial hatred in Burma; even the company’s chief technology officer, Mike Schroepfer, called this “awful”. The report’s recommendations include:
• “Clear legal liability” for tech companies “to act against harmful and illegal content” with failure to act resulting in criminal proceedings.
• Full auditing and scrutiny of tech companies, including their security mechanisms, and full algorithm auditing. Strengthen the Information Commissioner’s Office (ICO). Impose a levy on tech companies operating in the UK to pay for it. The Competition and Markets Authority should investigate fake profiles and advertising fraud.
• A ban on micro-targeted political advertising to similar audiences.
• A “new category of tech company” to be formulated which “tightens tech companies’ liabilities and which is not necessarily either a ‘platform’ or ‘publisher’”.
• Sweeping new powers for the Electoral Commission and a comprehensive overhaul of existing legislation that governs political advertisements during elections.
• A further demand for Mark Zuckerberg “to come to the committee to answer questions to which Facebook has not responded adequately to date”.
• A code of ethics that all tech companies will agree to uphold.
Cambridge Analytica
It was the Cambridge Analytica scandal that blew open the committee’s inquiry in March, though the company had already been a focus of its investigations. At its hearing in Washington in February, the committee had asked a series of pointed questions of Simon Milner, a Facebook executive, whose answers the report describes as “disingenuous”. The report includes:
• New allegations of an undercover sting carried out by a “temporary SCL [parent company of Cambridge Analytica] employee” who was paid £10,000 by Alexander Nix to bribe a politician in St Kitts in the Caribbean.
• Details about Cambridge Analytica’s relationship with Henley & Partners, a passport investment company that has programmes in St Kitts and Malta. The committee also noted that Daphne Caruana Galizia, the investigative journalist murdered in October last year, had been investigating the company and had highlighted the damage that these passport sales were inflicting on both Malta and the European Union. The report urges the National Crime Agency (NCA) to investigate SCL’s work with Henley & Partners. It also notes that Lord Ashcroft had recently extolled the virtues of Malta as the “best destination for ambitious UK firms” to have a post-Brexit presence in the EU.
• “Really worrying” evidence that SCL also worked for the UK government, had provided psychological operations training for Ministry of Defence staff, had received classified information about Afghanistan and, according to Nix’s testimony, had “secret clearance”. The report urges the government to ensure the NCA investigates.
• Concerns about “the administrator’s proposals in connection with SCL Elections Ltd, as listed in Companies House, and the fact that Emerdata Ltd is listed as the ultimate parent company of SCL Elections Ltd, and is the major creditor and owed £6.3m”. The report recommends that the NCA investigate.
Russia
Damian Collins, the committee’s chair, said that the committee believed the evidence it had received so far from Facebook represented only the “tip of the iceberg”.
The report includes:
• A call for a full investigation into contacts between Arron Banks – millionaire backer of Leave.EU – and Russian officials and his business dealings with Russian companies. “We understand the National Crime Agency is investigating these matters. We believe that they should be given full access to any relevant information that will aid their inquiry.”
• A call for a full investigation into the source of Banks’s donations to the referendum “to verify that the money was not sourced from abroad”. It notes that “should there be any doubt, the matter should be referred to the NCA”.
• A proposal that future elections should include limits on individual donors and that major donors should “demonstrate the source of their donations”.
• A demand for further investigation by Facebook of Russian interference on its platform. Facebook’s denial of “direct Russian interference using Facebook in the Brexit referendum” was “disingenuous and typical of Facebook’s handling of our questions”.
• Calls for a full ICO report on Aleksandr Kogan, the Russian-born scientist who was working in Russia with grants from the Russian government while harvesting Facebook data for Cambridge Analytica.
• A request for more information about the ICO’s statement that Cambridge Analytica’s systems had been “accessed from IP addresses that resolve to Russia and other areas of the CIS [Commonwealth of Independent States]”.
• A demand for a statement from the British government on “how many investigations are currently being carried out into Russian interference in UK politics”. It says: “It would be wrong for Robert Mueller’s inquiry to take the lead about related issues in the UK.” It calls for coordination between all relevant authorities.