
Designing Internet technologies for the public good

People are very often unaware of how much data is gathered about them, let alone the purposes for which it can be used.

MEPs failed to support a Green call to protect Edward Snowden as a whistleblower, which would have allowed him to give his testimony to the European Parliament in March. Image by greensefa.

Computers have developed enormously since the Second World War: alongside a rough doubling of computer power every two years, communications bandwidth and storage capacity have grown just as quickly. Computers can now store much more personal data, process it much faster, and rapidly share it across networks.

Data is collected about us as we interact with digital technology, both directly and via organisations. Many people volunteer data to social networking sites, and sensors (in smartphones, CCTV cameras, and "Internet of Things" objects) are making the physical world as trackable as the virtual. People are very often unaware of how much data is gathered about them, let alone the purposes for which it can be used. Moreover, most privacy risks are highly probabilistic, cumulative, and difficult to calculate. A student sharing a photo today might not be thinking about a future interview panel, or realise that heart rate data shared from a fitness gadget might affect future decisions by insurers and financial services (Brown 2014).

Rather than organisations waiting for something to go wrong, then spending large amounts of time and money trying (and often failing) to fix privacy problems, computer scientists have been developing methods for designing privacy directly into new technologies and systems (Spiekermann and Cranor 2009). One of the most important principles is data minimisation: limiting the collection of personal data to what is needed to provide a service, rather than storing everything that can conveniently be gathered. This limits the impact of data losses and breaches, for example through misuse by corrupt staff with authorised access to data, which the UK Information Commissioner's Office (2006) has shown to be widespread.
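To make the principle concrete, here is a minimal sketch in Python (the service and field names are invented for illustration, not drawn from any system discussed here): a sign-up handler keeps only a whitelist of the fields it needs to provide the service, and discards everything else before storage, so a later loss or breach can only expose the minimised record.

    # Minimal data-minimisation sketch (hypothetical service and fields).
    REQUIRED_FIELDS = {"email"}           # needed to deliver the service
    OPTIONAL_FIELDS = {"display_name"}    # stored only if volunteered

    def minimise(submitted: dict) -> dict:
        """Keep only whitelisted fields; drop everything else."""
        allowed = REQUIRED_FIELDS | OPTIONAL_FIELDS
        missing = REQUIRED_FIELDS - submitted.keys()
        if missing:
            raise ValueError(f"missing required fields: {missing}")
        # Location, contacts, device IDs and so on are discarded rather
        # than stored "just in case" they prove useful later.
        return {k: v for k, v in submitted.items() if k in allowed}

    record = minimise({
        "email": "alice@example.com",
        "display_name": "Alice",
        "location": "51.75,-1.25",            # not needed: discarded
        "phone_contacts": ["bob", "carol"],   # not needed: discarded
    })
    print(record)  # {'email': 'alice@example.com', 'display_name': 'Alice'}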

Privacy by design also protects against function creep (Gürses et al. 2011). When an organisation invests significant resources to collect personal data for one reason, it can be very tempting to use it for other purposes. While this is limited in the EU by data protection law, government agencies are in a good position to push for changes to national laws if they wish, bypassing such “purpose limitations.” Nor do these rules tend to apply to intelligence agencies.

Another key aspect of putting users in control of their personal data is making sure they know what data is being collected and how it is being used, and, ideally, asking for their consent. There have been some interesting experiments with privacy interfaces, for example helping smartphone users understand which apps are asking for their location data, and what data has recently been shared with whom.
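As a toy illustration of the data such an interface needs (in Python; the apps and log entries are invented, and this is not any real platform's API), a device could keep a local access log and summarise it for the user:

    # Toy sketch: summarise which apps asked for which data recently.
    from collections import Counter
    from datetime import datetime

    access_log = [
        # (timestamp, requesting app, data type)
        (datetime(2014, 3, 1, 9, 5), "weather_app", "location"),
        (datetime(2014, 3, 1, 9, 6), "social_app", "contacts"),
        (datetime(2014, 3, 1, 13, 40), "weather_app", "location"),
        (datetime(2014, 3, 2, 8, 15), "game_app", "location"),
    ]

    def recent_requests(log, since):
        """Count (app, data type) requests made since a given time."""
        return Counter((app, data) for ts, app, data in log if ts >= since)

    for (app, data), n in recent_requests(access_log,
                                          since=datetime(2014, 3, 1)).items():
        print(f"{app} requested your {data} {n} time(s)")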

Smartphones have enough storage and computing capacity to do some tasks locally, such as showing users adverts relevant to their known interests, without sharing any personal data with third parties such as advertisers (Haddadi et al. 2011). This kind of user-controlled data storage and processing has many other applications, for example smart electricity metering (Danezis et al. 2013) and road congestion charging (Balasch et al. 2010).
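The published smart-meter protocol is more elaborate, but its core building block, additive secret-sharing, can be sketched in a few lines of Python (a toy illustration under simplified trust assumptions, not the Danezis et al. design): each meter splits its reading into random shares that are individually meaningless, and the shares can be combined to reveal only the neighbourhood total.

    # Toy additive secret-sharing: aggregate readings without revealing them.
    import secrets

    PRIME = 2**61 - 1  # field modulus, far larger than any reading

    def share(reading, n):
        """Split a reading into n additive shares mod PRIME."""
        shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
        shares.append((reading - sum(shares)) % PRIME)  # shares sum to reading
        return shares

    readings = [12, 30, 7]   # three households' consumption (kWh)
    n_aggregators = 3

    # Each aggregator receives one share from every meter; any single
    # aggregator's view is uniformly random and reveals no reading.
    all_shares = [share(r, n_aggregators) for r in readings]
    partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]

    total = sum(partial_sums) % PRIME  # only the total is revealed
    print(total == sum(readings))      # True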

What broader lessons can be drawn about shaping technologies for the public good? What is the public good, and who gets to define it? One option is to look at opinion polling about public concerns and values over long periods of time. The European Commission’s Eurobarometer polls reveal that in most European countries (including the UK), people have had significant concerns about data privacy for decades.

A more fundamental view of core social values can be found at the national level in constitutions, and between nations in human rights treaties. As well as the protection of private life and correspondence in Article 8 of the European Convention on Human Rights, the rights to freedom of thought, expression, association and assembly in Articles 9–11 (and their equivalents in the US Bill of Rights and the International Covenant on Civil and Political Rights) are also relevant.

This national and international law restricts how states may use technology to infringe human rights, even for national security purposes. Several US legal challenges to the constitutionality of NSA communications surveillance are under way: a federal court in Washington DC found that bulk access to phone records violates the Fourth Amendment [1], while another court in New York found the opposite [2]. The UK campaign groups Big Brother Watch, Open Rights Group, and English PEN have taken a case to the European Court of Human Rights, arguing that UK law in this regard is incompatible with the Convention.

Can technology development be shaped more broadly to reflect such constitutional values? One of the best-known attempts is the European Union’s data protection framework. Privacy is a core European political value, not least because of the horrors of the Nazi and Communist regimes of the 20th century. Germany, France and Sweden all developed data protection laws in the 1970s in response to the development of automated systems for processing personal data, followed by most other European countries. The EU’s Data Protection Directive (95/46/EC) harmonises these laws, and has provisions that encourage organisations to use technical measures to protect personal data.

An update of this Directive, which the European Parliament has been debating over the last year, includes this type of regulation by technology more explicitly. Under the proposed General Data Protection Regulation, organisations processing personal data will have to implement appropriate technical measures to protect the rights it sets out. By default, organisations should collect only the minimum personal data they need, and allow individuals to control the distribution of their personal data. The Regulation would also require companies to make it easier for users to download all of their data, so that it could be uploaded to a competing service (for example, one with better data protection), bringing market pressure to bear (Brown and Marsden 2013).

This type of technology regulation is not uncontroversial. The European Commissioner responsible for the Data Protection Regulation until July, Viviane Reding, said that she had seen unprecedented and "absolutely fierce" lobbying against some of its provisions. Legislators would clearly be foolish to try to micro-manage the development of new technology. But the EU's principles-based approach to privacy has been internationally influential, with over 100 countries having now adopted data privacy laws along the lines of the Data Protection Directive (Greenleaf 2014).

If the EU can find the right balance in its Regulation, it has the opportunity to set the new global standard for privacy-protective technologies, a significant advantage in the global marketplace.

[1] Klayman v. Obama, 2013 WL 6571596 (D.D.C. 2013)

[2] ACLU v. Clapper, No. 13-3994 (S.D.N.Y. December 27, 2013)

References

Balasch, J., Rial, A., Troncoso, C., Preneel, B., Verbauwhede, I. and Geuens, C. (2010) PrETP: Privacy-preserving electronic toll pricing. 19th USENIX Security Symposium, pp. 63–78.

Brown, I. (2014) The economics of privacy, data protection and surveillance. In J.M. Bauer and M. Latzer (eds.) Research Handbook on the Economics of the Internet. Cheltenham: Edward Elgar.

Brown, I. and Marsden, C. (2013) Regulating Code: Good Governance and Better Regulation in the Information Age. Cambridge, MA: MIT Press.

Danezis, G., Fournet, C., Kohlweiss, M. and Zanella-Béguelin, S. (2013) Smart Meter Aggregation via Secret-Sharing. ACM Smart Energy Grid Security Workshop.

Greenleaf, G. (2014) Sheherezade and the 101 data privacy laws: Origins, significance and global trajectories. Journal of Law, Information & Science.

Gürses, S., Troncoso, C. and Diaz, C. (2011) Engineering Privacy by Design. Computers, Privacy & Data Protection.

Haddadi, H., Hui, P., Henderson, T. and Brown, I. (2011) Targeted Advertising on the Handset: Privacy and Security Challenges. In Müller, J., Alt, F. and Michelis, D. (eds) Pervasive Advertising. Heidelberg: Springer, pp. 119–137.

Information Commissioner’s Office (2006) What price privacy? HC 1056.

Spiekermann, S. and Cranor, L.F. (2009) Engineering Privacy. IEEE Transactions on Software Engineering 35 (1).


Read the full article: Keeping our secrets? Designing Internet technologies for the public good, European Human Rights Law Review 4: 369-377. This article is adapted from Ian Brown’s 2014 Oxford London Lecture, given at Church House, Westminster, on 18 March 2014, supported by Oxford University’s Romanes fund.

Professor Ian Brown is Associate Director of Oxford University’s Cyber Security Centre and Senior Research Fellow at the Oxford Internet Institute. His research is focused on information security, privacy-enhancing technologies, and Internet regulation.