Scientists at the University of Nottingham have discovered how to create different colors of blue cheese.
University of Nottingham
Gourmands are well aware of the many varieties of blue cheese, recognizable by the blue-green veins that ripple through the cheese. Different kinds of blue cheese have distinctive flavor profiles: they can be mild or strong, sweet or salty, for example. Soon we might be able to buy blue cheeses that belie the name and sport veins of different colors: perhaps yellow-green, reddish-brown-pink, or lighter or darker shades of blue, according to a recent paper published in the journal npj Science of Food.
“We’ve been interested in cheese fungi for over 10 years, and traditionally when you develop mould-ripened cheeses, you get blue cheeses such as Stilton, Roquefort, and Gorgonzola, which use fixed strains of fungi that are blue-green in color,” said Paul Dyer of the University of Nottingham, a co-author of this latest research. “We wanted to see if we could develop new strains with new flavors and appearances.”
Blue cheese has been around for a very long time. Legend has it that a young boy left his bread and ewe’s milk cheese in a nearby cave to pursue a lovely young lady he’d spotted in the distance. Months later, he came back to the cave and found that the cheese had molded into Roquefort. It’s a fanciful tale, but scholars think the basic idea is sound: people used to store cheeses in caves because their temperature and moisture levels were especially hospitable to harmless molds. That notion was bolstered by a 2021 analysis of paleofeces, which found evidence that Iron Age salt miners in Hallstatt (Austria) between 800 and 400 BCE were already eating blue cheese and quaffing beer.
The manufacturing process for blue cheese is largely the same as for any cheese, with a few crucial additional steps. It requires cultivation of Penicillium roqueforti, a mold that thrives on exposure to oxygen. The P. roqueforti is added to the cheese, sometimes before curds form and sometimes mixed in with curds after they form. The cheese is then aged in a temperature-controlled environment. Lactic acid bacteria trigger the initial fermentation but eventually die off, and the P. roqueforti take over as secondary fermenters. Piercing the curds forms air tunnels in the cheese, and the mold grows along those surfaces to produce blue cheese’s signature veining.
Once scientists published the complete genome for P. roqueforti, it opened up opportunities for studying this blue cheese fungus, per Dyer et al. Different strains “can have different colony cultures and textures, with commercial strains being sold partly on the basis of color development,” they wrote. This coloration comes from pigments in the coatings of the spores that form as the colony grows. Dyer and his co-authors set out to determine the genetic basis of this pigment formation in the hopes of producing altered strains with different spore coat colors.
The team identified a specific biochemical pathway that begins with white and progresses through yellow-green, red-brown-pink, dark brown, and light blue to, ultimately, that iconic dark blue-green. They used targeted gene deletion to block pigment biosynthesis genes at various points along this pathway. This altered the spore color, providing a proof of principle, without adversely affecting the production of flavor volatiles or raising the levels of secondary metabolites called mycotoxins. (The latter are present in low enough concentrations in blue cheese so as not to be a health risk for humans, and the team wanted to ensure those concentrations remained low.)
(left) Spectrum of color strains produced in Penicillium roqueforti. (right) Cross sections of cheeses made with the original (dark blue-green) or new color (red-brown, bright green, white albino) strains of the fungus.
University of Nottingham
However, food industry regulations prohibit gene-deletion fungal strains for commercial cheese production. So Dyer et al. used UV mutagenesis—essentially “inducing sexual reproduction in the fungus,” per Dyer—to produce non-GMO mutant strains of the fungi to create “blue” cheeses of different colors, without increasing mycotoxin levels or impacting the volatile compounds responsible for flavor.
“The interesting part was that once we went on to make some cheese, we then did some taste trials with volunteers from across the wider university, and we found that when people were trying the lighter colored strains they thought they tasted more mild,” said Dyer. “Whereas they thought the darker strain had a more intense flavor. Similarly, with the more reddish-brown and a light green one, people thought they had a fruity, tangy element to them—whereas, according to the lab instruments, they were very similar in flavor. This shows that people do perceive taste not only from what they taste but also by what they see.”
Dyer’s team is hoping to work with local cheese makers in Nottingham and Scotland and is setting up a spinoff company to commercialize the mutant strains. And there could be other modifications on the horizon. “Producers could almost dial up their list of desirable characteristics—more or less color, faster or slower growth rate, acidity differences,” Donald Glover of the University of Queensland in Australia, who was not involved in the research, told New Scientist.
The metals that form the foundation of modern society also cause a number of problems. Separating the metals we want from other minerals is often energy-intensive and can leave behind large volumes of toxic waste. Getting them in a pure form can often require a second and considerable energy input, boosting the associated carbon emissions.
A team of researchers from Germany has now figured out how to handle some of these problems for a specific class of mining waste created during aluminum production. Their method relies on hydrogen and electricity, both of which can be sourced from renewable power, and it extracts iron and potentially other metals from the waste. What’s left behind may still be toxic but isn’t as environmentally damaging.
Out of the mud
The first step in aluminum production is the isolation of aluminum oxide from the other materials in the ore. This leaves behind a material known as red mud; it’s estimated that nearly 200 million tonnes are produced annually. While the red color comes from the iron oxides present, there are a lot of other materials in it, some of which can be toxic. And the process of isolating the aluminum oxide leaves the material with a very basic pH.
All of these features mean that the red mud generally can’t (or at least shouldn’t) be returned to the environment. It’s typically kept in containment ponds—globally, these are estimated to house 4 billion tonnes of red mud, and many containment ponds have burst over the years.
The iron oxides can account for over half the weight of red mud in some locations, potentially making it a good source of iron. Traditional methods have processed iron ores by reacting them with carbon, leading to the release of carbon dioxide. But there have been efforts made to develop “green steel” production in which this step is replaced by a reaction with hydrogen, leaving water as the primary byproduct. Since hydrogen can be made from water using renewable electricity, this has the potential to eliminate a lot of the carbon emissions associated with iron production.
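To make the chemistry concrete, here is a rough sketch of the reduction step, written for hematite purely as an illustration (red mud contains a mix of iron oxides, so the real stoichiometry varies):

$$\mathrm{Fe_2O_3} + 3\,\mathrm{H_2} \rightarrow 2\,\mathrm{Fe} + 3\,\mathrm{H_2O}$$

The hydrogen carries the oxygen away as water vapor rather than carbon dioxide, which is what makes this route a candidate for low-carbon iron when the hydrogen itself comes from renewable electricity.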
The team from Germany decided to test a method of green steel production on red mud. They heated some of the material in an electric arc furnace under an atmosphere that was mostly argon (which wouldn’t react with anything) and hydrogen (at 10 percent of the mix).
Pumping (out) iron
The reaction was remarkably quick. Within a few minutes, metallic iron nodules started appearing in the mixture, and iron production was largely complete by about 10 minutes. The iron was also quite pure: roughly 98 percent of the material in the nodules, by weight, was iron.
Starting with a 15-gram sample of red mud, the process reduced this to 8.8 grams, as much of the oxygen in the material was liberated in the form of water. (It’s worth noting that this water could be cycled back to hydrogen production, closing the loop on this aspect of the process.) Of that 8.8 grams, about 2.6 grams (30 percent) was in the form of iron.
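As a back-of-the-envelope check on those figures, here is a minimal sketch using only the numbers quoted above (the rounding is mine):

```python
# Rough mass balance for the red mud reduction run described above.
sample_in_g = 15.0   # starting red mud sample
residue_g = 8.8      # solid material remaining after reduction
iron_g = 2.6         # metallic iron recovered as nodules

mass_lost_g = sample_in_g - residue_g   # largely oxygen driven off as water
iron_share = iron_g / residue_g         # fraction of the residue that is iron

print(f"Mass lost: {mass_lost_g:.1f} g ({mass_lost_g / sample_in_g:.0%} of the input)")
print(f"Iron recovered: {iron_g} g ({iron_share:.0%} of what remains)")
```

Running this gives about 6.2 grams lost (roughly 41 percent of the input) and an iron share of about 30 percent, consistent with the figures reported above.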
The researchers also found small bits of relatively pure titanium formed in the mix. So there’s a chance the process could be used in the production of additional metals, although it would probably need to be optimized to boost the yield of anything other than iron.
The good news is that there’s much less red mud left to worry about after this. Depending on the source of the original aluminum-containing ore, some of this may include relatively high concentrations of valuable materials, such as rare earth minerals. The downside is that any toxic materials in the original ore are going to be significantly more concentrated.
As a small plus, the process also neutralizes the pH of the remaining residue. So, that’s at least one less thing to worry about.
The downside is that the process is incredibly energy-intensive, both in producing the hydrogen required and running the arc furnace. The cost of that energy makes things economically challenging. That’s partly offset by the lower processing costs—the ore has already been obtained and has a relatively high purity.
But the key feature of this process is its extremely low carbon emissions. Right now, there’s no price on those emissions in most countries, which makes the economics of the process far more difficult.
There’s rarely time to write about every cool science-y story that comes our way. So this year, we’re once again running a special Twelve Days of Christmas series of posts, highlighting one science story that fell through the cracks in 2023, each day from December 25 through January 5. Today: Archaeologists relied on chemical clues and techniques like FTIR spectroscopy and archaeomagnetic analysis to reconstruct the burning of Jerusalem by Babylonian forces around 586 BCE.
Archaeologists have uncovered new evidence in support of Biblical accounts of the siege and burning of the city of Jerusalem by the Babylonians around 586 BCE, according to a September paper published in the Journal of Archaeological Science.
The Hebrew Bible contains the only account of this momentous event, which included the destruction of Solomon’s Temple. “The Babylonian chronicles from these years were not preserved,” co-author Nitsan Shalom of Tel Aviv University in Israel told New Scientist. According to the biblical account, “There was a violent and complete destruction, the whole city was burned and it stayed completely empty, like the descriptions you see in [the Book of] Lamentations about the city deserted and in complete misery.”
Judah was a vassal kingdom of Babylon during the late 7th century BCE, under the rule of Nebuchadnezzar II. This did not sit well with Judah’s king, Jehoiakim, who revolted against the Babylonian king in 601 BCE despite being warned not to do so by the prophet Jeremiah. He stopped paying the required tribute and sided with Egypt when Nebuchadnezzar tried (and failed) to invade that country. Jehoiakim died and his son Jeconiah succeeded him when Nebuchadnezzar’s forces besieged Jerusalem in 597 BCE. The city was pillaged, and Jeconiah surrendered and was deported to Babylon for his trouble, along with a substantial portion of Judah’s population. (The Book of Kings puts the number at 10,000.) His uncle Zedekiah became king of Judah.
Zedekiah also chafed under Babylonian rule and revolted in turn, refusing to pay the required tribute and seeking an alliance with the Egyptian pharaoh Hophra. This resulted in a brutal 30-month siege by Nebuchadnezzar’s forces against Judah and its capital, Jerusalem. Eventually the Babylonians prevailed again, breaking through the city walls to conquer Jerusalem. Zedekiah was forced to watch as his sons were killed; he was then blinded, bound, and taken to Babylon as a prisoner. This time Nebuchadnezzar was less merciful and ordered his troops to completely destroy Jerusalem and pull down its walls around 586 BCE.
There is archaeological evidence to support the account of the city being destroyed by fire, along with nearby villages and towns on the western border. Three residential structures excavated between 1978 and 1982 were found to contain burned wooden beams dating to around 586 BCE. Archaeologists also found ash and burned wooden beams from the same time period when they excavated several structures at the Giv’ati Parking Lot archaeological site, close to the assumed location of Solomon’s Temple. Samples taken from a plaster floor showed exposure to high temperatures of at least 600 degrees Celsius.
Aerial view of the excavation site in Jerusalem, at the foot of the Temple Mount
Assaf Peretz/Israel Antiquities Authority
However, it wasn’t possible to determine from that evidence whether the fires were intentional or accidental, or where the fire started if it was indeed intentional. For this latest research, Shalom and her colleagues focused on the two-story Building 100 at the Giv’ati Parking Lot site. They used Fourier transform infrared (FTIR) spectroscopy—which measures the absorption of infrared light to determine the degree to which a sample has been heated—and archaeomagnetic analysis, which determines whether samples containing magnetic minerals were sufficiently heated to reorient those minerals’ magnetization toward a new magnetic north.
The analysis revealed varying degrees of exposure to high-temperature fire in three rooms (designated A, B, and C) on the bottom level of Building 100, with Room C showing the most obvious evidence. This might have been a sign that Room C was the ignition point, but there was no fire path; the burning of Room C appeared to be isolated. Combined with an earlier 2020 study on segments of the second level of the building, the authors concluded that several fires were lit in the building and the fires burned strongest in the upper floors, except for that “intense local fire” in Room C on the first level.
“When a structure burns, heat rises and is concentrated below the ceiling,” the authors wrote. “The walls and roof are therefore heated to higher temperatures than the floor.” The presence of charred beams on the floors suggests this was indeed the case: most of the heat rose to the ceiling, burning the beams until they collapsed to the floors, which otherwise were subjected to radiant heat. But the extent of the debris was likely not caused just by that collapse, suggesting that the Babylonians deliberately went back in and knocked down any remaining walls.
Furthermore, “They targeted the more important, the more famous buildings in the city,” Shalom told New Scientist, rather than destroying everything indiscriminately. “2600 years later, we’re still mourning the temple.”
While they found no evidence of additional fuels that might have served as accelerants, “we may assume the fire was intentionally ignited due to its widespread presence in all rooms and both stories of the building,” Shalom et al. concluded. “The finds within the rooms indicate there was enough flammable material (vegetal and wooden items and construction material) to make additional fuel unnecessary. The widespread presence of charred remains suggests a deliberate destruction by fire…. [T]he spread of the fire and the rapid collapse of the building indicate that the destroyers invested great efforts to completely demolish the building and take it out of use.”
The Royal Institution was founded in 1799 and is still located in the same historic building at 21 Albemarle Street in London.
If you’re a fan of science, and especially science history, no trip to London is complete without visiting the Royal Institution, browsing the extensive collection of artifacts housed in the Faraday Museum, and perhaps taking in an evening lecture by one of the many esteemed scientists routinely featured—including the hugely popular annual Christmas lectures. (The lecture theater may have been overhauled to meet the needs of the 21st century, but walking inside still feels a bit like stepping back through time.) So what better time than the Christmas season to offer a virtual tour of some of the highlights contained within the historic walls of 21 Albemarle Street?
The Royal Institution was founded in 1799 by a group of leading British scientists. This is where Thomas Young explored the wave theory of light (at a time when the question of whether light was a particle or a wave was hotly debated); John Tyndall conducted experiments in radiant heat; Lord Rayleigh discovered argon; James Dewar liquefied hydrogen and invented the forerunner of the thermos; and father-and-son duo William Henry and William Lawrence Bragg invented x-ray crystallography.
No fewer than 14 Nobel laureates have conducted ground-breaking research at the Institution over the ensuing centuries, but the 19th-century physicist Michael Faraday is a major focus. In fact, there is a full-sized replica of Faraday’s magnetic laboratory—where he made so many of his seminal discoveries—in the original basement room where he worked, complete with an old dumbwaiter from when the room was used as a servants’ hall. Its arrangement is based on an 1850s painting by one of Faraday’s friends, and the room is filled with objects used by Faraday over the course of his scientific career.
The son of an English blacksmith, Faraday was apprenticed to a bookbinder at 14, a choice of profession that enabled him to read voraciously, particularly about the natural sciences. In 1813, a friend gave Faraday a ticket to hear the eminent scientist Humphry Davy lecture on electrochemistry at the Royal Institution. He was so taken by the presentation that he asked Davy to hire him. Davy initially declined, but shortly afterwards sacked his assistant for brawling, and hired Faraday to replace him. Faraday helped discover two new compounds of chlorine and carbon in those early days, learned how to make his own glass, and also invented an early version of the Bunsen burner, among other accomplishments.
Painting of the Royal Institution circa 1838, by Thomas Hosmer Shepherd.
Public domain
Michael Faraday giving one of his famous Christmas lectures.
Royal Institution
A Friday Evening Discourse at the Royal Institution; Sir James Dewar on Liquid Hydrogen, by Henry Jamyn Brooks, 1904
Public domain
The Lecture Theatre as it looks today
Faraday’s magnetic laboratory in the basement of the Royal Institution
Royal Institution
A page from one of Faraday’s notebooks
Royal Institution
Faraday was particularly interested in the new science of electromagnetism, first discovered in 1820 by Hans Christian Ørsted. In 1821, Faraday discovered electromagnetic rotation—which converts electricity into mechanical motion via a magnet—and used that underlying principle to build the first electric motor. The Royal Institution’s collection includes the only surviving electric motor that Faraday built: a wire hanging down into a glass vessel with a bar magnet at the bottom. Faraday would fill the glass with mercury (an excellent conductor) and then connect his apparatus to a battery, which sent electricity through the wire. This created a magnetic field around the wire, and that field’s interaction with the magnet at the bottom of the glass vessel caused the wire to rotate in a clockwise direction.
Ten years later, Faraday succeeded in showing that a jiggling magnet could induce an electrical current in a wire. Known as the principle of the dynamo, or electromagnetic induction, it became the basis of electric generators, which convert the energy of a changing magnetic field into an electrical current. One of Faraday’s induction rings is on display, composed of coils of wire wound on opposite sides of the ring and insulated with cotton. Passing electricity through one coil would briefly induce a current in the other. Also on display is one of Faraday’s generators: a bar magnet and a simple cotton-insulated tube wound with a coil of wire.
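In modern notation (a later mathematical gloss, not something Faraday himself wrote down), the principle of induction says the voltage generated in a loop is set by how quickly the magnetic flux through it changes:

$$\mathcal{E} = -\frac{d\Phi_B}{dt}$$

Move the magnet faster, or wind more turns of wire into the coil, and the induced voltage grows accordingly, which is precisely what electric generators exploit.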
In yet another experiment, Faraday placed a piece of heavy leaded glass across the poles of an electromagnet to see how light would be affected by a magnet. He passed light through the glass, and when he turned on the electromagnet, he found that the polarization of the light had rotated slightly. This is called the magneto-optical effect (or Faraday effect), and it demonstrated that magnetism is related not just to electricity but also to light. The Royal Institution has a Faraday magneto-optical apparatus with which he “at last succeeded in… magnetizing a ray of light.” In 1845, Faraday also discovered diamagnetism, a property of certain materials that gives them a weak repulsion from a magnetic field.
Equipment used by Faraday to make glass
Drawing of Faraday’s electromagnetic rotation experiment.
Public domain
Faraday motor (electric magnetic rotation apparatus), 1822
Royal Institution
Faraday’s dynamo (generator), October 1831
Royal Institution
Faraday’s induction ring
Royal Institution
Faraday’s magneto-optical apparatus
Royal Institution
One of Faraday’s preserved iron filings patterns (1851), showing magnetic lines of force
Royal Institution
Faraday’s original gold colloids are still active well over a century later
Shining laser light through a gold colloid mixture produces the Faraday-Tyndall Effect.
Royal Institution
Faraday concluded from all those experiments that magnetism was the center of an elaborate system of invisible curved tentacles (magnetic lines of force) that spread throughout space like the roots of trees branching through the earth. He was able to demonstrate these lines of force by coating sheets of paper with wax and placing them on top of bar magnets. When he sprinkled powdery iron filings on the surface, those filings were attracted to the magnets, revealing the lines of force. And by gently heating the waxed paper, he could fix the filings in place in the softened wax, preserving the patterns.
In the 1850s, Faraday’s interests turned to the properties of light and matter. He made his own gold slides and shone light through them to observe the interactions. Commercial gold leaf, typically made by hammering the metal into thin sheets, was much too thick for his purposes, so Faraday had to prepare his own thin films via chemical means, a process that involved washing the gold films. The resulting faint red fluid intrigued Faraday, and he kept samples in bottles, shining light through the fluids and noting an intriguing “cone effect” (now known as the Faraday-Tyndall Effect)—the result of particles of gold suspended in the fluid that were much too small to see.
One might consider Faraday an early nanoscientist, since the gold particles suspended in those fluids are now known as metallic nanoparticles. The Institution’s current state-of-the-art nanotechnology lab is appropriately located right across from Faraday’s laboratory in the basement. And even though Faraday’s gold colloids are well over a century old, they remain optically active. There’s no way to figure out why this might be the case without opening the bottles, but the bottles are too valuable as artifacts to justify doing that.
Plenty of other scientific luminaries have their work commemorated in the Royal Institution’s collection, including that of Faraday’s mentor, Humphry Davy, who discovered the chemical elements barium, strontium, sodium, potassium, calcium and magnesium. Early in the 19th century, there were several explosions in northern England’s coal mines caused by the lamps used by the miners accidentally igniting pockets of flammable gas. Davy was asked to come up with a safer lighting alternative.
Schematic for the Davy lamp
Public domain
Humphry Davy’s miner’s lamp (left) displayed alongside his rival George Stephenson’s lamps
Royal Institution
Schematic for John Tyndall’s radiant heat apparatus
Royal Institution
Tyndall’s radiant heat tube
Royal Institution
Tyndall’s blue sky tube, 1869
Royal Institution
Title page of Tyndall’s Heat: A Mode of Motion
Paul Wilkinson/Royal Institution
After experimenting with several prototypes, Davy finally settled on a simple design in 1815 consisting of a “chimney” made of wire gauze to enclose the flame. The gauze absorbed heat to prevent the flame from igniting flammable gas but still let through sufficient light. The invention significantly reduced fatalities among coal miners. Davy had a rival, however, in a mining engineer named George Stephenson, who independently developed his own design that was remarkably similar to Davy’s. Samples of both are displayed in the Institution’s lower ground floor “Light Corridor.” Davy’s lamp would ultimately triumph, while Stephenson went on to fame as a pioneer of steam-powered railroad locomotives.
Atmospheric physicist John Tyndall was a good friend of Faraday and shared the latter’s gift for public lecture demonstrations. His experiments on radiation and the heat-absorptive power of gases were undertaken with an eye toward developing a better understanding of the physics of molecules. Among the Tyndall artifacts housed in the Royal Institution is his radiant heat tube, part of an elaborate experimental apparatus he used to measure the extent to which infrared radiation was absorbed and emitted by various gases filling its central tube. By this means he concluded that water vapor absorbs far more radiant heat than the atmosphere’s main gases, and hence that the vapor is crucial for moderating Earth’s climate via a natural “greenhouse effect.”
The collection also includes Tyndall’s “blue sky apparatus,” which the scientist used to explain why the sky is blue during the day and takes on red hues at sunset—namely, particles in the Earth’s atmosphere scatter sunlight and blue light is scattered more strongly than red light. (It’s the same Faraday-Tyndall effect observed when shining light through Faraday’s gold colloids.)
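The wavelength dependence behind that observation can be stated compactly (in the modern form later worked out by Rayleigh, rather than anything in Tyndall's own apparatus): for scatterers much smaller than the wavelength of light, the scattered intensity scales as

$$I_{\text{scattered}} \propto \frac{1}{\lambda^{4}}$$

so blue light at roughly 450 nm is scattered about $(700/450)^4 \approx 6$ times more strongly than red light at roughly 700 nm, hence blue skies overhead and reddened light at sunset, when sunlight must traverse far more atmosphere.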
James Dewar in the Royal Institution, circa 1900
Public domain
A Dewar flask
Royal Institution
The x-ray spectrometer developed by William Henry Bragg.
Royal Institution
Bragg’s rock salt model
On Christmas Day, 1892, James Dewar exhibited his newly invented Dewar flask at the Royal Institution for the first time; he used it for his cryogenic experiments on liquefying gases. Back in 1872, Dewar and Peter Tait had built a vacuum-insulated vessel to keep things warm, and Dewar adapted that design for a flask meant to keep things cold—specifically, cold enough to maintain the extremely low temperatures at which gases transition into liquid form. Dewar failed to patent his invention, however; the patent eventually went to the Thermos company in 1904, which marketed the product for keeping liquids hot as well as cold.
As for William Henry Bragg, he studied alpha, beta, and gamma rays early in his career and hypothesized that both gamma rays and x-rays had particle-like properties. This was bolstered by Max von Laue’s Nobel Prize-winning discovery that crystals could diffract x-rays. Bragg and his son, William Lawrence—then a student at Trinity College, Cambridge—began conducting their own experiments. Bragg père invented a special “ionization spectrometer,” in which a crystal could be rotated to precise angles so that the different scattering patterns of x-rays could be measured. The pair used the instrument to determine the structure of crystals and molecules, winning the 1915 Nobel Prize in Physics for their efforts. That spectrometer, the prototype of today’s x-ray diffractometers, is still housed in the Royal Institution, along with their model of the atomic structure of rock salt.
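The condition the Braggs worked out, now taught as Bragg's law, ties the angles at which strong x-ray reflections appear to the spacing of the atomic planes in the crystal:

$$n\lambda = 2d\sin\theta$$

with $\lambda$ the x-ray wavelength, $d$ the distance between planes, $\theta$ the angle of incidence, and $n$ a whole number. Measure the angles of the reflections and, knowing $\lambda$, you can work backward to the atomic spacings, which is essentially how structures like that of rock salt were solved.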
Great British Bake Off judges Paul Hollywood and Prue Leith (top) and presenters Alison Hammond and Noel Fielding.
Mark Bourdillon/Love Productions/Channel 4
The Great British Bake Off (TGBBO)—aka The Great British Baking Show in the US and Canada—features amateur bakers competing each week in a series of baking challenges, culminating in a single winner. The recipes include all manner of deliciously decadent concoctions, including the occasional Christmas dessert. But many of the show’s Christmas recipes might not be as bad for your health as one might think, according to a new paper published in the annual Christmas issue of the British Medical Journal, traditionally devoted to more light-hearted scientific papers.
TGBBO made its broadcast debut in 2010 on the BBC, and its popularity grew quickly and spread across the Atlantic. The show was inspired by the traditional baking competitions at English village fetes (see any British cozy murder mystery for reference). Now entering its 15th season, the current judges are Paul Hollywood and Prue Leith, with Noel Fielding and Alison Hammond serving as hosts/presenters, providing (occasionally off-color) commentary. Each week features a theme and three challenges: a signature bake, a technical challenge, and a show-stopper bake.
The four co-authors of the new BMJ study—Joshua Wallach of Emory University and Yale University’s Anant Gautam, Reshma Ramachandran, and Joseph Ross—are avid fans of TGBBO, which they declare to be “the greatest television baking competition of all time.” They are also fans of desserts in general, noting that in medieval England, the Catholic Church once issued a decree requiring that Christmas pudding be made four weeks before Christmas. Those puddings were more stew-like, containing things like prunes, raisins, carrots, nuts, spices, grains, eggs, beef, and mutton. Hence, those puddings were arguably more “healthy” than the modern take on desserts, which contain a lot more butter and sugar in particular.
But Wallach et al. wondered whether even today’s desserts might be healthier than popularly assumed and undertook an extensive review of the existing scientific literature for their own “umbrella review.” It’s actually pretty challenging to establish direct causal links in the field of nutrition, whether we’re talking about observational studies or systematic reviews and meta-analyses. For instance, many of the former focus on individual ingredients and do not take into account the effects of overall diet and lifestyle. They also may rely on self-reporting by study participants. “Are we really going to accurately report how much Christmas desserts we frantically ate in the middle of the night, after everyone else went to bed?” the authors wrote. Systematic reviews are prone to their own weaknesses and biases.
“But bah humbug, it is Christmas and we are done being study design Scrooges,” the authors wrote, tongues tucked firmly in cheeks. “We have taken this opportunity to ignore the flaws of observational nutrition research and conduct a study that allows us to feel morally superior when we happen to enjoy eating the Christmas dessert ingredients in question (eg, chocolate). Overall, we hoped to provide evidence that we need to have Christmas dessert and eat it too, or at least evidence that will inform our collective gluttony or guilt this Christmas.”
The team scoured the TGBBO website and picked 48 dessert recipes for Christmas cakes, cookies, pastries, and puddings, such as Val’s Black Forest Yule Log, or Ruby’s Boozy Chai, Cherry and Chocolate Panettones. There were 178 unique ingredients contained in those recipes, and the authors classified those into 17 overarching ingredient groups: alcohol; baking soda, powder, and similar ingredients; chocolate; cheese and yogurt; coffee; eggs; butter; food coloring, flavors, and extracts; fruit; milk; nuts; peanuts or peanut butter; refined flour; salt; spices; sugar; and vegetable fat.
Wallach et al. identified 46 review articles pertaining to health and nutrition regarding those classes of ingredients for their analysis. That yielded 363 associations between the ingredients and the risk of death or disease, although only 149 were statistically significant. Of those 149 associations, 110 (74 percent) were linked to reduced health risks, while 39 (26 percent) were linked to increased risks. The most common ingredients associated with reduced risk were fruit, coffee, and nuts, while alcohol and sugar were the most common ingredients associated with increased risk.
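Those percentages are easy to recheck; here is a minimal sketch using just the counts quoted above:

```python
# Tally of statistically significant ingredient-health associations
# reported in the BMJ umbrella review (counts as quoted above).
significant = 149
reduced_risk = 110
increased_risk = 39

assert reduced_risk + increased_risk == significant
print(f"Reduced risk:   {reduced_risk}/{significant} = {reduced_risk / significant:.0%}")
print(f"Increased risk: {increased_risk}/{significant} = {increased_risk / significant:.0%}")
```

This prints 74 percent and 26 percent, matching the figures in the paper.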
Take Prue Leith’s signature chocolate Yule log, for example, which is “subtly laced with Irish cream liqueur.” Most of the harmful ingredient associations were for the alcohol content, which various studies have shown to increase risk of liver cancer, gastric cancer, colon cancer, gout, and atrial fibrillation. While alcohol can evaporate during cooking or baking, in this case it’s the cream filling that contains the alcohol, which is not reduced by baking. (Leith has often expressed her preference for “boozy bakes” on the show.)
By contrast, Rav’s Frozen Fantasy Cake contains several healthy ingredients, most notably almonds and passion fruit, and thus carried a significantly decreased risk for disease or death. Ditto for Paul Hollywood’s Stollen, which contains almonds, milk, and various dried fruits. “Overall, without the eggs, butter, and sugar, this dessert is essentially a fruit salad with nuts,” the authors wrote. That is, of course, a significant caveat, because the eggs, butter, and sugar kinda make the dessert. But Wallach et al. note that most of the dietary studies condemning sugar focused on the nutritional effects of sugar-sweetened beverages, and none of the TGBBO Christmas dessert recipes used such beverages, “no doubt because they would have resulted in bakes with a soggy bottom.”
The BMJ study has its limitations, relying as it does on evidence from prior observational studies. Wallach et al. also did not take into account how much of each ingredient was used in any given recipe. Regardless of whether the recipe called for a single berry or an entire cup of berries, that ingredient was weighted the same in terms of its protective effects countering the presumed adverse effects of butter. Would a weighted analysis have been more accurate? Sure, but it would also have been much less fun.
So, is this a genuine Christmas miracle or an amusing academic exercise in creative rationalization? Maybe we shouldn’t overthink it. “It is Christmas so just enjoy your desserts in moderation,” the authors concluded.