University of Cambridge Computer Laboratory

Ross Anderson

[Research] [Blog] [Politics] [My Book] [Music] [Contact Details]

What's New

Here is a survey paper on security economics which I gave at Softint 2007 on January 20th; my slides; and a shorter survey that was recently published in Science. See also my Economics and Security Resource Page.

2006 blog highlights included technical papers on topics from protecting power-line communications to the Man-in-the-Middle Defence, as well as a major report on the safety and privacy of children's databases for the UK Information Commissioner, which got a lot of publicity. I ended the year by debating health privacy on the Today programme with health minister Lord Warner, who resigned shortly afterwards. 2005 blog highlights included research papers on The topology of covert conflict, on combining cryptography with biometrics, on Sybil-resistant DHT routing, and on Robbing the bank with a theorem prover; and a survey paper on cryptographic processors, a shortened version of which appeared in the February 2006 Proceedings of the IEEE. 2004 highlights included papers on cipher composition, key establishment in ad-hoc networks and the economics of censorship resistance. I also lobbied for amendments to the EU IP Enforcement Directive and organised a workshop on copyright which led to a common position adopted by many European NGOs.


Research

I am Professor of Security Engineering at the Computer Laboratory. I supervise a number of research students - Jolyon Clulow, Hao Feng, Stephen Lewis, Tyler Moore, Shishir Nagaraja, Andy Ozment and Robert Watson. Richard Clayton, Steven Murdoch and Sergei Skorobogatov are postdocs. Mike Bond, Vashek Matyas and Andrei Serjantov are former postdocs. Jong-Hyeon Lee, Frank Stajano, Fabien Petitcolas, Harry Manifavas, Markus Kuhn, Ulrich Lang, Jeff Yan, Susan Pancho, Mike Bond, George Danezis, Sergei Skorobogatov, Hyun-Jin Choi and Richard Clayton have earned PhDs.

My other research interests include:

Many of my papers are available in html and/or pdf, but some of the older technical ones are in postscript, for which you can download a viewer here. By default, when I post a paper here I license it under the relevant Creative Commons license, so you may redistribute it but not modify it. I may subsequently assign the residual copyright to an academic publisher.

Economics of information security

Over the last few years, it's become clear that many systems fail not for technical reasons so much as from misplaced incentives - often the people who could protect them are not the people who suffer the costs of failure. There are also many questions with an economic dimension as well as a technical one. For example, will digital signatures make electronic commerce more secure? Is so-called `trusted computing' a good idea, or just another way for Microsoft to make money? And what about all the press stories about `Internet hacking' - is this threat serious, or is it mostly just scaremongering by equipment vendors? It's not enough for security engineers to understand ciphers; we have to understand incentives as well. This has led to a rapidly growing interest in `security economics', a discipline which I helped to found. I maintain the Economics and Security Resource Page, and my research contributions include the following.

Our annual bash is the Workshop on Economics and Information Security. My Economics and Security Resource Page provides a guide to the literature and to what else is on. There is also a web page on the economics of privacy, maintained by Alessandro Acquisti.

Peer-to-Peer systems

Since about the middle of 2000, there has been an explosion of interest in peer-to-peer networking - the business of building useful systems out of large numbers of intermittently connected machines, with virtual infrastructures that are tailored to the application. One of the seminal papers in the field was The Eternity Service, which I presented at Pragocrypt 96. I had been alarmed by the Scientologists' success at closing down the penet remailer in Finland, and had been personally threatened by bank lawyers who wanted to suppress knowledge of the vulnerabilities of ATM systems (see here for a later incident). This taught me that electronic publications can be easy for the rich and the ruthless to suppress. They are usually kept on just a few servers, whose owners can be sued or coerced. To me, this seemed uncomfortably like books in the Dark Ages: the modern era only started once the printing press enabled seditious thoughts to be spread too widely to ban. The Eternity Service was conceived as a means of putting electronic documents as far outwith the censor's grasp as possible. (The concern that motivated me has since materialised; a UK court judgment has found that a newspaper's online archives can be altered by order of a court to remove a libel.)

But history never repeats itself exactly, and the real fulcrum of censorship in cyberspace turned out to be not sedition, or vulnerability disclosure, or even pornography, but copyright. Hollywood's action against Napster led to my Eternity Service ideas being adopted by many systems including Publius and Freenet. Many of these developments were described in an important book, and the first academic conference on peer-to-peer systems was held in March 2002 at MIT. The field has since become very active. See also Richard Stallman's classic, The Right to Read.

My contributions since the Eternity paper include:

I ran a CMI project with Frans Kaashoek and Robert Morris on building a next-generation peer-to-peer system. I gave a keynote talk about this at the Wizards of OS conference in Berlin; the slides are here.

Robustness of cryptographic protocols

Many security system failures are due to poorly designed protocols, and this has been a Cambridge interest for many years. Some relevant papers follow.

Protocols have been the stuff of high drama. Citibank asked the High Court to gag the disclosure of certain crypto API vulnerabilities that affect a number of systems used in banking. I wrote to the judge opposing this; a gagging order was still imposed, although in slightly less severe terms than Citibank had requested. The trial was in camera, the banks' witnesses didn't have to answer questions about vulnerabilities, and new information revealed about these vulnerabilities in the course of the trial may not be disclosed in England or Wales. Information already in the public domain was unaffected. The vulnerabilities were discovered by Mike Bond and me while acting as the defence experts in a phantom withdrawal court case, and independently discovered by the other side's expert, Jolyon Clulow, who later joined us as a research student. They are of significant scientific interest, as well as being relevant to the rights of the growing number of people who suffer phantom withdrawals from their bank accounts worldwide. Undermining the fairness of trials and forbidding discussion of vulnerabilities isn't the way forward. See press coverage by the New Scientist, the Register, Slashdot, news.com, and Zdnet.


Reliability of security systems

I have been interested for many years in how security systems fail in real life. This is a prerequisite for building robust secure systems; many security designs are poor because they are based on unrealistic threat models. This work began with a study of automatic teller machine fraud, and then expanded to other applications as well. It now provides the central theme of my book.

The papers on physical security by Roger Johnston's team are also definitely worth a look, and there's an old leaked copy of the NSA Security Manual that you can download (also as latex).


Analysis and design of cryptographic algorithms

Reports of an attack on the hash function SHA have made Tiger, which Eli Biham and I designed in 1995, a popular choice of cryptographic hash function. I also worked with Eli, and with Lars Knudsen, to develop Serpent - a candidate block cipher for the Advanced Encryption Standard. Serpent won through to the final of the competition and got the second largest number of votes. Another of my contributions was founding the series of workshops on Fast Software Encryption.

Other papers on cryptography and cryptanalysis include the following.


Information hiding (including Soft Tempest)

From the mid- to late-1990s, I did a lot of work on information hiding.


Security of Medical Information Systems

Medical information security is a subject in which I've worked on and off for over a decade. It's highly topical right now: the UK government is building a national database of medical records, a project which many doctors oppose; half of all GPs say they won't upload their patients' data. Ministers have given a guarantee of patient privacy, but GPs, NGOs and commentators are sceptical. There are radio pieces on the problems here and here, and comments on broken government promises here. There is an article with some examples of privacy abuses, and a report that the Real IRA penetrated the Royal Victoria Hospital in Northern Ireland and used its electronic medical records to gather information on policemen to target them and their families for murder. A particularly shocking case was that of Helen Wilkinson, who needed to organise a debate in Parliament to get ministers to agree to remove defamatory and untrue information about her from NHS computers. The minister assured the House that the libels had been removed; months later, they still had not been.

Civil servants started pushing for online access to everyone's records in 1992 and I got involved in 1995, when I started consulting for the British Medical Association on the safety and privacy of clinical information systems. Back then, the police were given access to all drug prescriptions in the UK, after the government argued that they needed it to catch the occasional doctor who misprescribed heroin. The police got their data, they didn't catch Harold Shipman, and no-one was held accountable.

The NHS slogan in 1995 was `a unified electronic patient record, accessible to all in the NHS'. The slogan has changed several times, but the goal remains the same. The Health and Social Care (Community Health and Standards) Act allowed the Government access to all medical records in the UK, for the purposes of `Health Improvement'. It removed many of the patient privacy safeguards in previous legislation. In addition, the new contract offered to GPs since 2003 moves ownership of family doctor computers to Primary Care Trusts (that's health authorities, in oldspeak). There was a token consultation on confidentiality; the Foundation for Information Policy Research, which I chair, published a response to it (which was of course ignored).

The last time people pointed out that NHS administrators were helping themselves illegally to confidential personal health information, Parliament passed some regulations on patient privacy to legalise those questionable practices that had been brought to public attention. For example, the regulations compel doctors to give the government copies of all records relating to infectious disease and cancer. The regulations were made under an Act that was rushed through in the shadow of the 2001 election and that gives ministers broad powers to nationalise personal health information. For the background to that Act, see an editorial from the British Medical Journal, a discussion paper on the problems that the bill could cause for researchers, and an impact analysis commissioned by the Nuffield Trust. Ministers claimed the records were needed for cancer registries: yet cancer researchers in many other countries work with anonymised data (see papers on German cancer registries here and here, and the website of the Canadian Privacy Commissioner.) There was contemporary press coverage in the Observer, the New Statesman, and The Register; and Hansard reports the Parliamentary debate on the original bill in the Commons and the Lords.

In the end, perhaps only a European law challenge can halt the slide toward surveillance. The regulations appear to breach the Declaration of Helsinki on ethical principles for medical research, and contravene the Council of Europe recommendation no R(97)5 on the protection of medical data, to which Britain is a signatory. There is a list of some more of the problems here, and a letter we wrote to the BMJ here.

Some relevant papers of my own follow. They are mostly from the 1995-6 period, when the government last tried to centralise all medical records - and we saw them off.

Two health IT papers by colleagues deserve special mention. Privacy in clinical information systems in secondary care describes a hospital system implementing something close to the BMA security policy (it is described in more detail in a special issue of the Health Informatics Journal, v 4 nos 3-4, Dec 1998, which I edited). Second, Protecting Doctors' Identity in Drug Prescription Analysis describes a system designed to de-identify prescription data for commercial use; although de-identification usually does not protect patient privacy very well, there are exceptions, such as here. This system led to a court case, in which the government tried to stop its owner promoting it - as it would have competed with their (less privacy-friendly) offerings. The government lost: the Court of Appeal decided that personal health information can be used for research without patient consent, so long as the de-identification is done competently.

A first-class collection of links to papers on the protection of de-identified data is maintained by the American Statistical Association. Bill Lowrance wrote a good survey for the US Department of Health and Human Services of the potential for using de-identified data to protect patient privacy in medical research, while a report by the US General Accounting Office shows how de-identified records are handled much better by Medicare than by the NHS. For information on what's happening in the German speaking world, see Andreas von Heydwolff's web site and Gerrit Bleumer's European project links. Resources on what's happening in the USA - where medical privacy is a very live issue - include many NGOs: Patient Privacy Rights, EPIC, the Privacy Rights Clearinghouse, the Medical Privacy Coalition, the Citizens' Council on Health Care, CPT, the Institute for Health Freedom, and Georgetown University's health privacy project (which has a comprehensive survey of US health privacy laws). Other resources include a report from the US National Academy of Sciences entitled For the Record: Protecting Electronic Health Information and a report by the US Office of Technology Assessment.


Public policy issues

I chair the Foundation for Information Policy Research, which I helped set up in 1998. This body is concerned with promoting research and educating the public in such topics as the interaction between computing and the law, and the social effects of IT. We are not a lobby group; our enemy is ignorance rather than the government of the day, and one of our main activities is providing accurate and neutral briefing for politicians and members of the press. Here's an overview of the issues as we saw them in 1999; some highlights of our work follow.

My pro-bono work also includes sitting on Council, our University's governing body. I stood for election because I was concerned about the erosion of academic freedom under the previous administration. See, for example, a truly shocking speech by Mike Clark at a recent discussion on IPR. Mike tells how our administration promised a research sponsor that he would submit all his relevant papers to them for prior review - without even asking him! It was to prevent abuses like this that we founded the Campaign for Cambridge Freedoms, whose goal was to defeat a proposal by the former Vice Chancellor that most of the intellectual property generated by faculty members - from patents on bright ideas to books written up from lecture notes - would belong to the university rather than to the person who created it. If this had passed, Cambridge would have swapped one of the most liberal rules on intellectual property of any British university for one of the most oppressive anywhere. Over almost four years of campaigning we managed to draw many of the teeth of this proposal.

A recent vote approved a policy in which academics keep copyright but the University gets 15% of patent royalties. The policy is however defective in many ways: for example, it allows the University to do IPR deals without the consent of affected staff and students. The authorities have undertaken to introduce amendments.

Finally, here is my PGP key. If I revoke this key, I will always be willing to explain why I have done so provided that the giving of such an explanation is lawful. (For more, see FIPR.)


My Book on Security Engineering


Wiley has finally agreed to let me put my book online! You can download it here.

Security engineering is about building systems to remain dependable in the face of malice, error or mischance. As a discipline, it focuses on the tools, processes and methods needed to design, implement and test complete systems, and to adapt existing systems as their environment evolves. My book has become the standard textbook and reference since it was published in 2001.

Security engineering is not just concerned with `infrastructure' matters such as firewalls and PKI. It's also about specific applications, such as banking and medical record-keeping, and about embedded systems such as automatic teller machines and burglar alarms. It's usually done badly: it often takes several attempts to get a design right. It is also hard to learn: although there were good books on a number of the component technologies, such as cryptography and operating systems security, there was little about how to use them effectively, and even less about how to make them work together. It's hardly surprising that most systems fail not because the mechanisms are weak, but because they're used incorrectly.

My book was an attempt to help the working engineer to do better. As well as the basic science, it contains details of many typical applications - and a lot of case histories of how their protection mechanisms failed. It contains a fair amount of new material, as well as accounts of a number of technologies which aren't well described in the accessible literature. Writing it was also pivotal in founding the now-flourishing field of information security economics: I realised that the critical narrative had to do with incentives and organisation at least as often as with ciphers and access control. This has led me to spend a fair proportion of my research time since on the interface between security and the social sciences.

More ...


Contact details

University of Cambridge Computer Laboratory
JJ Thomson Avenue
Cambridge CB3 0FD, England

E-mail: Ross.Anderson@cl.cam.ac.uk
Tel: +44 1223 33 47 33
Fax: +44 1223 33 46 78

I don't execute programs sent by strangers without good reason. So I don't read attachments in formats such as Word, unless by prior arrangement. I also discard html-only emails, as most of them are spam; and emails asking for `summer research positions' or `internships', which we don't do.

If you're contacting me about coming to Cambridge to do a PhD, please read the relevant web pages first.