
Why do these failures happen?

It is my observation that many of the safety and privacy failures within the health system that have recently alarmed doctors and the public in the UK can be traced to the civil service culture of the NHS's computing organisation, which tackles projects in a completely different way from private sector companies that develop safety-critical systems for air traffic control, nuclear plant management and motor vehicle electronics. The private sector operates on the basis that successful system construction and operation requires a high degree of openness between users and developers. This openness is vital for communicating precisely what is required, what sort of failures have occurred or are likely to occur, and how the resulting risks can be managed. Systems must also support operational openness, providing constant feedback on what errors have occurred and enabling mutual vigilance.

Perhaps the textbook example of safety culture is air traffic control, which is documented (for example) by Shapiro et al [2]. Here, controllers and chiefs work together in a highly cooperative way, sharing information resources and keeping an eye on each other's work (as well as the work of controllers and chiefs in neighbouring airspace sectors). The philosophy is that everyone makes mistakes, so it is vital that as many of these as possible are caught by others, without egos getting in the way. A good controller is not just one who spots and points out others' mistakes, but one who, on spotting a mistake of his own, admits it at once and corrects it publicly. There is a continuous effort to maintain openness and honesty; the controllers who are failed are those who, having made a mistake, try to put it right quietly and incrementally.

By contrast, the UK civil service culture is one of secrecy and blame shifting. This is particularly evident in the NHS's attempts to deal with the `Y2K' problem (the `Year 2000' problem, or `millennium bug') - the fact that many systems record the year in two digits only and will break down when first confronted with a date after the turn of the millennium. For example, many of the transfusion pumps used in the NHS will fail; they are set to become inactive if more than six months have passed since they were last serviced, and the date used to measure this interval stores the year in only two digits [3].
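To make the failure mode concrete, here is a minimal C sketch - hypothetical code, not the actual pump firmware - showing how a service-interval check based on a two-digit year can wrongly decide that a recently serviced device is overdue once the year rolls over from 99 to 00. The structure, function and threshold names are illustrative assumptions.

    /* A minimal sketch (hypothetical; not the actual pump firmware) of how a
       two-digit year breaks a "serviced within six months" interlock. */
    #include <stdio.h>

    /* Dates stored with a two-digit year, as many embedded devices of the
       period did to save memory. */
    struct yy_date {
        int year2;  /* year modulo 100, i.e. 0..99 */
        int month;  /* 1..12 */
    };

    /* Months elapsed between the last service and today, computed naively
       from the two-digit years. */
    static int months_since_service(struct yy_date last, struct yy_date now)
    {
        return (now.year2 - last.year2) * 12 + (now.month - last.month);
    }

    int main(void)
    {
        struct yy_date serviced = { 99, 11 };  /* serviced November 1999 */
        struct yy_date today    = {  0,  1 };  /* January 2000 reads as year "00" */

        int elapsed = months_since_service(serviced, today);

        /* Here elapsed = (0 - 99) * 12 + (1 - 11) = -1198.  A check written as
           "overdue if elapsed > 6, or if elapsed is not a sensible value" then
           disables a pump that was in fact serviced only two months ago. */
        if (elapsed > 6 || elapsed < 0)
            printf("Pump disabled: service overdue (elapsed = %d months)\n", elapsed);
        else
            printf("Pump OK (elapsed = %d months)\n", elapsed);

        return 0;
    }

Depending on how a particular device compares the elapsed time with its threshold (signed, unsigned, or with a validity check as above), the symptom may be a lock-out, as here, or a silent failure to enforce the service interval at all; either way the two-digit year is the root cause.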

It is interesting to compare the responses of the British and Dutch health services to this problem. The Dutch made a thorough study of one hospital, coming up with 9000 items that needed fixing; this information was then shared with other hospitals. In Britain, on the other hand, the NHS Executive has been sending out regular circulars since 1996 telling hospital trust chief executives that although there may be a problem, it is up to them to solve it without any help, or extra budget, from the centre. The government body with a statutory responsibility for the safety of things like transfusion pumps, the Medical Devices Agency, takes the following line: `MDA believes that it would be irresponsible to set up any sort of general clearing house for information, since we could not verify information on numerous models and their possible variants, and it would be irresponsible to disseminate unverified claims that particular models are year 2000 compliant' [4].

Vendors of computer-driven equipment are generally reticent about problems, even though Section 6 of Britain's Health and Safety at Work Act obliges suppliers of equipment to warn customers of any potential hazards. In addition, most hospital managers refuse to let staff identify defective equipment to colleagues in other hospitals, for fear of legal action by suppliers should a warning prove incorrect. As a result there is massive duplication of testing effort. Current unofficial estimates of the likely cost of failing to deal with the Y2K problem run to between 600 and 1500 lives (lost through failures of medical equipment, of the radio systems used by ambulances, and of databases containing vital medical records) and up to GBP 600 million [5]. This assumes that there will be no significant failures of electricity distribution, transport systems or other critical infrastructure; the risk of such failures leaves us with an urgent need for coordinated contingency planning and supply chain management, and little time in which to do it.

Of course, the London Ambulance Service and Y2K problems are only the most dramatic manifestations of an inappropriate culture leading to safety failure. There are many others, notably in hospital records and in the management of recall programmes for cervical cancer screening [6,7].


Ross Anderson
1998-11-13