
Chemistry has played a vital role in our relationship with food since antiquity. How have chemists shaped the food we eat?

Woman in a white coat measuring out cornflakes in a scientific laboratory © Manchester Daily Express / Science & Society Picture Library


Chemistry is central to the history of how we produce, store and consume food, from preservation, pesticides and quality testing to artificial additives and substitutes.

And the art of cooking—transforming the qualities of ingredients and how they combine to form new textures and tastes—is itself a chemical process.  

This is the story of chemistry's impact on the food on our plate.

Chemicals in our food

Chemical food additives have a long history. In ancient China, paraffin wax was burnt to ripen fruit—this worked because the smoke released traces of ethylene and propylene, gases that trigger ripening. The Egyptians coloured food with saffron, while the Romans added alum (potassium aluminium sulfate) to bread to make it whiter. 

The first deliberate use of a food additive was likely salt to preserve foods such as fish and meat, which works by dehydrating the food to limit bacterial growth—but it wasn't until the 19th century that the microbial cause of food spoilage was understood.

In the 17th century, efforts were made to find a new method of food preservation without the need for additives. Robert Boyle—considered one of the pioneers of modern chemistry—led these investigations, experimenting with storing food in air-free containers. 

In the very early 19th century, a new preservation technology was developed in response to the military need for preserving food during the Napoleonic wars.

This revolutionary technology was the tin can, which combined sealing food in an air-tight container with heat sterilisation.  


Canned food is not without its risks to human health, however. The food inside could become contaminated with lead (early cans were sealed with lead-alloy solder) or with tin, leached from the can when corroded by acidic foods like fruit. 

To avoid this problem, some modern cans have plastic linings made from bisphenol-A (BPA)—a now-notorious chemical compound that studies have found interferes with our hormones. Manufacturers have removed BPA from their can linings in recent years in response to these findings.

Food additives in the age of industry

The use of food additives increased dramatically during the Industrial Revolution, with toxic compounds used liberally in factory food production. These ranged from the colouring of Gloucester cheese with red lead to sweets being coloured green with copper arsenite—a pigment also used in wallpaper and sometimes suggested as a factor in Napoleon's death. 

Sample of red lead formed in a smelting furnace, from Alport Lead Works, Derbyshire. Science Museum Group Collection

As the food processing industry grew and chemists synthesised new artificial thickeners, emulsifiers, colours and flavours, regulatory bodies were formed to control the adulteration of food for human consumption. 

Concern about the toxicity and carcinogenicity of additives intensified in the middle of the 20th century, as analytical chemistry made detecting and measuring them easier.

This soon led to international regulation, such as the European E-number system for approved additives, first introduced in 1962.

Quality testing food (or, How strong is that jelly?)

As well as flavouring, colouring and preserving food, chemists and chemical techniques are vital to ensuring that what food manufacturers produce is consistent.

Many food and drink companies have specialist laboratories, dedicated to both quality testing and research and development. 

The first instrument designed specifically for quality control in food manufacture was built, perhaps surprisingly, to test the consistency of fruit jelly.  

Devised by a German chemist in 1861, the jelly puncture test was soon followed by a series of improved jelly strength testers, as they became known. Driven by the practical concerns of the food industry, jelly testing led to a new field of chemistry concerned with investigating the properties of gelatinous substances.

We have two early jelly strength testers in our collection, which have a rather elaborate, overwrought appearance, not unlike one of Heath Robinson's comic creations.

This example was used at an East London confectionery factory, Bard Brothers, which in the mid-20th century was one of Britain’s leading suppliers of fruit jelly.  

 

Jelly tester

Other food and drink manufacturers made use of different equipment.

Horlicks, the company famous for its malted milk drink, donated a selection of its laboratory equipment to the Science Museum after its factory in Slough closed in 2018.  

One of the more unusual instruments from Horlicks was a dipping refractometer—first developed in 1899 by the famous optical company Carl Zeiss. The instrument works, as the name suggests, by dipping its prism into a liquid sample; the analyst then looks through a viewing telescope to read how much the light has been refracted.

Horlicks employed this technique to test the concentration of ingredients across different samples of their product. 

Horlicks refractometer

Artificial foods

Vial of margaric acid from a set of chemical preparations associated with Chevreul, 1820–1840. Science Museum Group Collection

The invention of artificial substitute foods is a relatively recent enterprise.

Margarine was the first, developed in 1869 in response to Napoleon III's call for a cheaper alternative to butter—then in scarce supply—for the working classes.

Its inventor, the French chemist Hippolyte Mège-Mouriès, derived his product from beef tallow after experimenting with margaric acid, a fatty acid discovered by his compatriot and pioneer of fat chemistry, Michel Eugène Chevreul. 

Mège-Mouriès sold his patent to the company that would become Unilever, the world’s largest producer of margarine.

Although naturally white, the margarine we know today is yellow, to imitate the colour of butter. But it hasn't always been so. In the early 20th century, several US states passed laws requiring margarine to be coloured pink, so that consumers wouldn't be misled into thinking it was butter. 

At around this time, chemists refined the hydrogenation process (introducing hydrogen to turn oils into semi-solid fats) in the production of margarine, leading to the replacement of animal with vegetable fats.

A 1:24 scale model of a hydrogenation plant used in the manufacture of margarine. Science Museum Group Collection

This made margarine even cheaper. There was also a belief that such unsaturated fats were healthier than the saturated fats of animal products like butter. 

As the 20th century wore on, however, mounting evidence linked the trans-unsaturated fat of margarine with heart disease. 

Governments and industry responded. Butter substitutes now contain greater amounts of saturated fat, such as palm oil—though this change has environmental consequences of its own, as forests are cleared for oil palm plantations. 

Saccharin tablets in a yellow and green box with a sugar cane design, 1970s. Science Museum Group Collection

As well as substitute foods like margarine, chemists have also developed substitute ingredients, such as artificial sweeteners. 

Saccharin was the first of these, named after the sugar cane genus.

Discovered in 1879 after a chemist noticed a sweet-tasting substance on his hand from experimenting with coal tar derivatives, saccharin is between 300 and 500 times sweeter than sugar. 

Its commercial success was driven by the need of people with diabetes (and later dieters) to find a sugar substitute.    

But like many chemical additives, there are concerns about saccharin's adverse effects on health—in the USA, a proposed ban in the 1970s led to products containing it carrying a health warning label for over two decades. 

Should we worry about chemicals in our food?

Today we're increasingly concerned about the use of chemical additives or artificial foods, as recent cultural trends have led consumers to seek organic and natural products. Manufacturers have readily exploited this fashion, with labels announcing that products are free from colourings or artificial preservatives.  

In many cases, there's little evidence to suggest they are unsafe. There are much greater risks to health from microbial food poisoning, which preservatives help prevent. 

Yet we're starting to understand that chemical additives may adversely affect our gut flora, a concern that has helped fuel the booming probiotic industry.  

As history has shown, chemists, government regulation and medical studies are all symbiotically shaping and changing what we eat. 

 

Find out more

Books 

  • Kenneth F Kiple (ed.), The Cambridge World History of Food (two volumes), 2000
  • Harvey Levenstein, Fear of Food: A History of Why We Worry About What We Eat, 2012
  • Sue Shephard, Pickled, Potted and Canned: How the Art and Science of Food Preserving Changed the World, 2006
  • Andrew F Smith, Sugar: A Global History, 2015
  • Deborah Jean Warner, Sweet Stuff: An American History of Sweeteners From Sugar to Sucralose, 2011

Online