When I was in university, I tossed a wet towel on top of my heap of laundry and headed home to my parents’ house for Christmas break. When I returned, everything under the towel had gone mouldy – tank tops, underwear, pajamas… I learned an expensive lesson, forced to throw out some of my favourite clothes after realizing the black spots wouldn’t wash out.
Meanwhile, upstairs in the kitchen, my roommate was extracting a loaf of bread he had stashed in the cupboard over the break. He grabbed the peanut butter and reached for the bread. It was also mouldy. He looked at me incredulously and said, “Bread goes bad?”
It does indeed. But not as fast as it used to.
“Eat only foods that will eventually rot. Real food is alive – and therefore it should eventually die.”
– Michael Pollan’s 13th food rule, in the category of “Eat Food”
All real food goes bad, with the possible exception of honey (apparently archaeologists unearthed 3000-year-old honey from Egyptian ruins, ate it, and lived to tell the tale).
Natural preservatives like salt, along with traditional techniques like dehydration, have long helped food stay good longer.
From what I can piece together, artificial (non-food) preservatives entered our food supply in earnest in the early 1900s. Between the late-1800s discovery of bacteria and the passage of the US Pure Food and Drug Act in 1906, it’s clear that people were increasingly concerned about the safety of their food. Technology improved; bread-making moved from the home or local bakery to the factory. Because people no longer knew their baker, they needed a way to judge whether the bread they were buying was fresh and unadulterated (the adulteration of bread was a recurring problem, especially during times of famine and poverty).
It’s hard to judge whether a product is “pure” (unadulterated) and fresh without watching it be made. So, the average consumer resorted to softness and appearance as proxies. And what makes bread feel soft and look “pure”? Preservatives and bleached white flour.
Rather than yearning for the hearth of the olden days, the 1930s consumer welcomed factory-made bread as an antidote to potential adulteration and the dirtiness associated with the human (especially immigrant human) hand. Technology meant machines could bake, wrap, and slice the bread without a person laying a finger on it. People were convinced their homes could never be clean enough to bake in; best leave the bread-making to the factories.
The perfect storm of food safety fears, industrial production, and technology made factory bread and preservatives not only possible – but the norm – much earlier than I thought. We’ve been judging a loaf by its crust since the turn of the 20th century.
Realistically, I know preservatives have their place: food scientists would say preservatives mean fewer people get sick. Thanks to preservatives, you can have “fresh” bread every day of the week without shopping every day. Less food spoilage means less food waste and better value for money. That’s one side of the story – but where’s the other? Who’s looking into the safety of these preservatives, into how they affect nutrition, and into what health effects they may have? Those voices are typically much quieter, but I’ll try to give them a stage in my next post.
References and further reading:
- Food Rules: An Eater’s Manual by Michael Pollan (2009)
- Pure Food and Drug Act of 1906 (PubMed synopsis)
- Swindled: The Dark History of Food Fraud, from Poisoned Candy to Counterfeit Coffee by Bee Wilson (2008)
- White Bread: A Social History of the Store-Bought Loaf by Aaron Bobrow-Strain (2012)