Turning points of disaster: the evolution of regulation in Pharma

14 April 2020

Regulations in pharmaceutical Quality Assurance and Quality Control play a pivotal role in the manufacture of medicines today – but it hasn’t always been this way. Today’s regulations are strict for a reason; before GMP, the MHRA and even the FDA as we know it now, the production of drugs was largely unregulated. Unsurprisingly, this led to some very serious errors and tragic circumstances – and, in response, to turning points for the industry which have undoubtedly saved thousands of lives since.

Preventable deaths and the birth of regulation

The first significant event to note came in 1902, when Congress passed the Biologics Control Act in the United States, requiring that products be tested for purity and strength before they were sold. The Act was a response to what is now recognised as the first modern medical disaster, in which contaminated serum was used to prepare a diphtheria antitoxin. What should have been relatively straightforward recoveries resulted in the preventable deaths of at least 12 children from tetanus.

Harvey Wiley, chief chemist at the US Department of Agriculture, had campaigned persistently for stricter regulation for some 25 years before this disaster. Following the tragedy, the Food and Drugs Act – also known as the Wiley Act – was signed by President Theodore Roosevelt in 1906. This marked the start of the FDA’s modern regulatory functions and set out requirements including that ingredients considered dangerous be declared on drug labels.

Disaster strikes again

The 1906 Food and Drugs Act remained more or less unchanged until 1933, when the FDA recommended a complete revision; there were apparent gaps in the 1906 Act, with some potentially hazardous products ruled permissible. Disagreements over what should replace it meant the amendments took five years to pass through Congress – by which time another tragedy had occurred: the Sulphanilamide Disaster, as it is now known.

Over 100 people died as a result of taking Sulphanilamide, which had previously been used safely and effectively in powder and tablet form to treat streptococcal infections. In June 1937, however, a salesman reported demand for a liquid form; the company’s chief chemist, Harold Watkins, found that the drug dissolved in diethylene glycol – a chemical normally used as antifreeze – and, after the preparation had been tested for appearance, flavour and fragrance, it was deemed satisfactory to compound. 633 shipments were immediately sent across America, despite the medicine never having been tested for toxicity.

Because the food and drugs law did not require safety studies to be carried out on new drugs, selling toxic drugs was, astonishingly, not illegal. What was illegal, however, was misbranding the drug as an “Elixir”, which implied the solution was alcohol-based rather than made with diethylene glycol; 25 of these so-called “Elixirs” were seized under federal law as a result. Had the medicine been labelled as a “solution”, the FDA would have had no authority to recover the drug and more people would have died.

Sulphanilamide in its liquid form, which killed over 100 people

As a result of this disaster, the Federal Food, Drug and Cosmetic Act was passed in 1938, containing new provisions: manufacturers had to show that new drugs were safe before marketing them, tolerances were set to avoid the distribution of poisonous substances, and factory inspections were authorised. When history repeated itself in the early 1960s in the form of the Thalidomide disaster – resulting in the deformity of thousands of European babies – this Act was said to have saved America, as the drug was never approved in the US thanks to FDA reviewer Frances Kelsey’s concerns over its safety, including reports of nerve damage (peripheral neuritis).

Europe’s turning point

In Europe, however, where there was little more than a fragmented series of laws controlling the quality and promotion of drugs, the Thalidomide disaster became one of the biggest medical catastrophes of the century. The drug was created in Germany in 1953 and prescribed as a sedative. Considered safe and free of side effects, it was prescribed to pregnant women for nausea and insomnia from 1957, and subsequently became the “drug of choice” for sufferers of morning sickness. Over a period of a few years, it resulted in as many as 10,000 babies being born blind, with a cleft palate, with bowel malformations or, most notably, with severe limb deformities. The drug was withdrawn from the market for this use by most countries – including the UK – in 1961.

Given its enormous impact on Europe, significant changes were implemented in both the UK and America. In 1962, the Kefauver-Harris Drug Amendments were passed to ensure greater drug safety; for the first time, manufacturers had to prove to the FDA that their products were effective before marketing them. The same amendments gave the FDA the authority to require compliance with Good Manufacturing Practice (GMP), with Good Laboratory Practice (GLP) regulations following in the 1970s. These practices provided a system ensuring proper monitoring and control of manufacturing processes and facilities, minimising risks which could impact the quality and reproducibility of pharmaceutical products.

In the UK, less than two years after the drug was withdrawn, the Committee on Safety of Drugs (CSD) was created. The committee had no legal powers, but its panel of medical experts had a voluntary agreement with the pharmaceutical industry that no drug would be put on trial against the committee’s advice. Later, in 1968, the Medicines Act was passed through Parliament, governing the control of medicines, including their manufacture and supply. Following this, in 1970, the CSD became the Committee on Safety of Medicines, advising the UK licensing authority on the quality, efficacy and safety of medicines to meet appropriate public health standards. This committee remained in place until as recently as 2005, when it was replaced by the Commission on Human Medicines, part of the Medicines and Healthcare products Regulatory Agency (MHRA), which had been founded two years earlier in 2003.

A collaborative approach

In 1980, the first International Conference of Drug Regulatory Authorities was organised by the World Health Organisation (WHO), with the aim of generating a shared approach to issues of common concern. At the 1989 conference in Paris, a need for collaboration between drug regulatory authorities across member states was identified, as it was recognised that globalised trade could undermine regulation. And so, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) – founded as the International Conference on Harmonisation – was born in Brussels in 1990. Its mission is to promote public health by achieving greater harmonisation through the development of technical guidelines and requirements for the registration of pharmaceutical products. Topics for harmonisation were divided into Safety, Quality and Efficacy, forming the basis for approving new medicinal products. The ICH has since evolved in response to global developments in the industry and grown as an organisation; now with 16 members and 32 observers, it has increased its international outreach and improved communication with a wider number of stakeholders.

What will the future look like?

As you can see, the world of Quality has evolved from being almost non-existent at a national level at the start of the 1900s to collaboration on a global scale today, ensuring the safety of people worldwide. Going forward, this evolution can be expected to continue as more organisations become actively involved in the ICH’s harmonisation work and regulatory bodies such as the FDA and MHRA update their guidance accordingly. Digital systems are likely to play a more prominent role in that guidance, as software such as Microgenetics SmartControl increases efficiency in labs whilst helping them to meet data integrity requirements. To find out how you can stay ahead of the future of Quality, click here to learn more about SmartControl.