The Oxford Union debate sponsored by KPMG’s Global Cyber Security practice
Society as a whole is perhaps too used to warnings about the need to keep our personal information safe from hackers, fraudsters and thieves, but it is less common to hear warnings about the dangers of consumers’ information falling into the hands of governments.
The issue of government access to personal information was thrown into relief earlier this year by the Apple v FBI case, when Apple refused an FBI request to bypass security on an iPhone belonging to a dead terrorist. The FBI wanted to search the phone for details of possible accomplices, but Apple argued that to comply would create a deliberate flaw in the security system relied on by millions of iPhone users. The company described the request as a “dangerous precedent” and an example of “overreach by the US government”.
In the event, the FBI found other ways to achieve its ends. But the case prompted the Oxford Union, the 200-year-old debating society run by students at Oxford University, where many prominent politicians (including 12 British Prime Ministers) have honed their debating skills, to stage a formal debate. The proposition was: "This House Believes Technology Companies Should Prevent Government Access to Consumers' Data".
The debate was sponsored by KPMG’s Global Cyber Security practice. Speaking for the proposition was Denelle Dixon-Thayer, Chief Legal and Business Officer of Mozilla, who leads the company’s global business development, public policy, and legal teams. She wrote Mozilla’s highly influential amicus brief in support of Apple. She was joined by John Taysom, a prominent private investor specializing in technology startups.
Opposing the proposition were Matthew Olsen, Director of the US National Counterterrorism Center 2011-2015, formerly Head of the Guantanamo Review Task Force and General Counsel of the National Security Agency, and Air Marshal Chris Nickols CBE, Chief of Defence Intelligence 2009-2012, and former Assistant Chief of the Defence Staff for Operations.
For those opposing the proposition, the main issue was the proper role of technology companies in the fight against crime and terrorism. Mr. Olsen argued that when criminals use modern technologies to communicate with each other, governments have a clear need to tap into those communications to fulfil their role of protecting the population from harm. The proposition would, he said, create a space for consumer information which companies could use for marketing and commercial purposes, but which governments were expressly forbidden from using to detect and prevent crime. This would be absurd.
Air Marshal Nickols reinforced the point with quotes from Andrew Parker, Director General of MI5, the UK's domestic intelligence and security service, saying that the police and security services had foiled six terrorist plots in the previous year, each of which could have resulted in loss of life equivalent to the Paris attacks in November 2015. Losing the ability to tap into modern communications would seriously reduce the ability of these services to fulfil their role of protecting the population.
He added that the independent reviewer of UK terrorism legislation, David Anderson QC, while highly critical of some aspects of the current system, has asked whether we want deliberately to create areas that are beyond the reach of the law. The Air Marshal's view was that to do so would be to compromise the massive efforts that governments put into protecting our freedoms.
The speakers for the proposition took a broader view. They argued that technology companies have no desire to protect criminals, and will readily agree to provide information to government agencies if the request is made lawfully and the courts have given proper legal authority. But what governments are asking for is a “back door”, allowing them to bypass the security systems that tens of millions of people rely on to keep their online transactions safe.
Denelle Dixon-Thayer said that a primary role of technology companies is to protect consumers' information from any unauthorized intrusion. To do this they need to innovate, and they need to be very good innovators to stay ahead of the hackers and fraudsters who are constantly looking for ways into their security systems. Providing governments with a way into secure systems would require technology companies to create deliberate flaws in their protection, which would be very quickly picked up and exploited by people other than governments, with aims other than protecting consumers and citizens.
This, she said, would be a serious blow to online commerce, because people would lose their confidence in the security of the systems they use every day for buying, selling, banking and a thousand other purposes. Technology companies simply cannot run the risk of losing that confidence; it is at the core of the service that they sell to consumers and it forms the basis for the trust that consumers have in the companies they use.
The counter-argument from the opposition side was that consumers routinely give up huge amounts of data to technology companies every day without thinking about the security implications. Very few people read the detailed legal agreements they are required to accept before using apps, so few know what data is being collected, how it is being stored, and for what purpose it is being used. If people are so unconcerned about giving their data to commercial interests, why would they be concerned about providing it to governments whose purpose it is to protect them?
John Taysom responded by examining and comparing the legitimate needs of commercial organizations and governments. On one hand, governments protect and tax their citizens, so they need to know who they are. But they do not need to know what they think, who they communicate with or (with the exception of specifically illegal items) what they buy.
On the other hand, commercial organizations like Google legitimately need to know what their customers think, what they like and what they buy, because they openly use the information to improve the targeting of commercial messages. But they have no need to know who their customers are.
This, he argued, is a crucial difference, because it provides a useful guide to the access to data each type of organization can legitimately claim. Governments have no business seeking free access to all consumers' data because to do so is a threat to democracy.
This opened the debate into a new area: what is the legitimate role of technology companies in society and how should they be regulated? Air Marshal Nickols held that trust in the technology companies, which are all large corporations owned by shareholders, has declined as corporations all over the world have come under suspicion of favoring the interests of their owners over those of their customers. He said that it is not the place of corporations to make decisions that affect the safety and security of their customers or of the countries where they operate. This is something that must remain in the hands of governments.
This idea was extended by the opposition speakers to suggest that consumer data security itself is something that cannot be properly left in the hands of technology companies. As quality and safety in other important areas of commerce like food and transport have increasingly been guaranteed by laws and regulations, so technology companies should be more closely regulated to ensure that the quality of security they provide is as high as it can be. Governments themselves should step in to set and enforce security standards.
John Taysom pointed out that the General Data Protection Regulation, being introduced by the European Union this year and due to come into force in May 2018, will indeed impose more regulation on technology companies in the interests of improving data security. But he added that easier access to data for governments would not necessarily make consumers and citizens safer. He argued that there was little evidence, for example, that the Paris attacks would have been prevented had government agencies had more access to communications data. Indeed, it is believed that the attackers used unencrypted SMS messages in advance of the attacks, and that authorities did know about these communications. The attacks still went ahead.
He made the point that more and more information will not necessarily make it easier to discern patterns of activity. Too much information can conceal as much as it reveals, and where some politicians and officials may be incompetent, ignorant of the facts, or taken up with other matters, giving them more data will not necessarily help them make better decisions.
Matthew Olsen sought to give the debate some historical context by pointing out the other circumstances in which similar discussions had taken place. At every step in the development of communications technology, he said, from the invention of the printing press, through to telephones and computers, the question has always been raised of what access the government of the day should have to the communications of its citizens.
Each time, in debates and legal cases going back over centuries, people have rejected the absolutist idea that there are areas where governments cannot go. Instead, time after time, we have opted for a balanced approach, where governments can go anywhere they need to provided they have a legitimate, legally sanctioned authority to demand the information they want. As it stands, he concluded, the proposition is simply too wide to be accepted.
At the end of a vigorous two hours of debate, the proposition was put to the vote, involving both the 300 people in the debating chamber and an online audience via Twitter. The combined result was a vote in favor, with 60 percent voting for the proposition and 40 percent voting against.
What was particularly interesting was the difference in the language used, perhaps unconsciously, by the two sides of the debate. The proponents, representing the technology companies, talked about consumers and transactions and the need to know what people do. The opposition, representing the government viewpoint, talked about citizens and individuals, who they are, who they talk to and what they say.
This probably reflects the different priorities that each side has when they are seeking to analyze and use data. But there was general agreement that, when presented with a legal warrant, technology companies would readily comply with government requests for information. Perhaps, when it comes to today’s regulations on government access, these two sides were not as far apart as it might have appeared.
The question remains, though, of whether it is technically possible for tech companies to comply with all demands for information. Companies are already required to retain metadata, but many argue that their systems are deliberately designed to make discovering the actual content of communications effectively impossible. To use an analogy, companies are saying that they have built the house, locked the door, and thrown away the key; they couldn't provide the information even if they wanted to.
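The "thrown away the key" claim rests on how end-to-end encryption is structured: the service operator stores only ciphertext, while the key exists solely on the user's device. A toy sketch in Python (using a one-time pad for illustration only, not a production cipher, and not any real company's implementation) shows why destroying the only copy of the key leaves the content unrecoverable even to a fully cooperative operator:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR every byte with the key.
    # The same operation both encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))   # generated and held only on the user's device
ciphertext = xor_cipher(message, key)     # this is all the operator ever stores

# With the key, the plaintext round-trips exactly.
recovered = xor_cipher(ciphertext, key)
assert recovered == message

# Without the key, every plaintext of the same length is an equally
# plausible decryption, so the operator has nothing meaningful to hand over.
```

Real messaging systems use modern ciphers rather than one-time pads, but the structural point is the same: when the key never leaves the user's device, a warrant served on the operator can yield only the ciphertext.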
Critics of the industry don’t find this argument convincing. It’s a widespread view that systems made by human ingenuity can always be unmade by human ingenuity. The history of apparently uncrackable codes that were later cracked would seem to support this idea.
Tech companies know this. They are fully aware that guaranteeing information security is a never-ending task. As Denelle Dixon-Thayer said, providing the highest possible level of security is at the core of the trust between consumers and the tech companies they use. Staying two or three steps ahead of the hackers is necessary if tech companies are to maintain their service standards. It is a common claim that compromising this principle to provide access for government agents could compromise the fundamental business model that underlies internet commerce.
Governments remain to be convinced, and technology companies need to focus their attention on future regulation arising from skeptical legislators. Calls for tougher international rules governing use of consumer data have already been answered in the form of the European Union's General Data Protection Regulation (GDPR).
Four years in the making, the new regulations will have a significant impact on how companies around the world handle consumer data when they come into force in May 2018. They tighten rules on accountability, privacy and consent for any company, anywhere in the world, that offers data services to EU citizens. Penalties for breach are substantial: up to 4 percent of annual worldwide turnover for some infringements.
Some companies are only now starting their planning for this change in the law. But industry leaders have seen this coming and should have plans in place in good time. It is, effectively, just the latest example of the law catching up with technical and commercial developments in data mining and processing.
The pace of these developments is not likely to slow down. The commercial value of data can only grow as analysis techniques improve, so the incentive for new companies to enter the market is high and will rise.
Governments, too, are unlikely to let up in their efforts to extract intelligence from big data, by all the means available. Behind the scenes, some of the best hackers in the world are working for governments in Europe, the Americas, and Asia, using a mixture of conventional investigative techniques and analysis of social media to find patterns of behavior.
In public, debates of the quality displayed at Oxford will continue to showcase persuasive arguments for improved government access to data of all kinds.
The proposition may have passed on this occasion, but technology companies cannot assume that the argument against greater government access has been won. In this fast-moving and vitally important industry, only those who can anticipate, comply with and stay ahead of advancing legislation can be sure of success.