15 Reasons Why Mould is More Prevalent Today

When I tell people that, since becoming sick with mould illness, I'm now sensitive to water-damaged buildings, I often hear, "There never used to be such an issue with mould. Why are we hearing so much about it now? Is it just social media?" The truth is, there ARE more mould and moisture issues in our residential and commercial buildings today, and it's a growing problem.

Mould is an integral part of the natural ecosystem and plays a critical role in decomposing organic materials. In outdoor environments, mould growth is regulated by various factors such as sunlight, airflow, and competition with other microorganisms. However, when mould finds its way indoors, it encounters conditions vastly different from its natural habitat. Indoors, the typical checks and balances provided by the outdoor environment are absent. This lack of regulation allows indoor mould to flourish, posing potential risks to health, property contents, and structural integrity if left unchecked. In water-damaged buildings, mould is often accompanied by various other contaminants, including bacteria (like actinomycetes), mycotoxins, antigens, volatile organic compounds (VOCs), and endotoxins. Since World War II, the prevalence of mould-related issues has increased significantly, with estimates suggesting that 18% to 50% of all buildings are affected, and there are a number of reasons for this. I've outlined some of them below:


#1: Post-WWII construction boom

The unprecedented surge in housing and construction activity following World War II played a significant role in exacerbating mould issues, primarily due to shifts in building materials and construction methods. The post-war period witnessed the emergence of a new trend towards constructing airtight and energy-efficient buildings. While these structures effectively reduced energy consumption, they inadvertently created the perfect breeding ground for mould growth by limiting air exchange and ventilation.

In their quest to save energy, builders started using synthetic materials like insulation, vapour barriers, and sealants. As it turns out, these materials didn't just keep the cold out; they also kept the moisture in. The synthetic components altered the way buildings managed moisture, transitioning from allowing structures to 'breathe'—meaning moisture could naturally pass through—to trapping moisture within the building envelope. Post-war construction advancements also prioritised speed and cost-effectiveness, often using materials more susceptible to moisture damage and mould growth, such as gyprock.

In the 1970s, amidst a global energy crisis marked by severe oil shortages and soaring power prices, there was a renewed focus on residential energy conservation. This drive prompted additional changes in construction methods to cut down on energy expenses related to heating, cooling, and lighting. However, these alterations inadvertently reduced natural ventilation and air exchange compared to older, draughtier homes. Consequently, more moisture became trapped indoors, increasing mould growth.

#2: Increased use of gyprock over plaster

In the mid-1940s, the demand for quicker and more cost-effective construction methods ushered in the era of paper-faced gypsum board, also known as gyprock, drywall, or gypsum wallboard. This innovation replaced traditional plaster as the go-to material for interior walls and ceilings, slashing costs by roughly 50%. Gyprock comprises a gypsum core sandwiched between two layers of paper. These organic paper layers and adhesives create an ideal environment for mould growth when they become moist. In contrast, traditional plaster, made primarily of powdered lime, sand, and fibres, forms a solid layer about an inch thick. Plaster contains minimal organic matter, making it inhospitable to mould. When wet, plaster remains alkaline, inhibiting microbial growth and allowing moisture to permeate through.

Unlike plaster, gyprock readily absorbs water and dries slowly once wet, providing a perfect breeding ground for mould, which can proliferate within 24 to 48 hours of exposure to water. Moreover, gyprock's high cellulose content makes it particularly conducive to the growth of cellulose-loving toxic moulds like Stachybotrys ("Black Mould"). By the late 1980s, most interior surfaces in homes were constructed with gyprock, marking a significant shift from mould-resistant materials like plaster, brick, and stone. Some evidence also suggests that gyprock is often pre-contaminated with toxic mould species embedded in the paper layers before it even arrives at the construction site, highlighting the challenges associated with its use.

#3: Use of synthetic timbers as an alternative to solid wood

The surge in the use of synthetic timbers like MDF (medium-density fibreboard), OSB (oriented strand board), plywood, and particle board began around the mid-20th century. Originally conceived as alternatives to traditional solid wood, these synthetic options gained popularity in construction and manufacturing due to their cost-effectiveness, versatility, and ease of production. Plywood gained favour in the 1930s and 1940s, followed by particle board in the 1950s, MDF in the 1960s, and OSB from the 1980s onward. Their widespread usage has come to symbolise contemporary construction and carpentry methods. However, unlike solid timbers, which contain naturally occurring lignins that resist mould, synthetic timbers lack these compounds and readily absorb moisture, leading to swelling and making them far more susceptible to mould contamination.

In Australia, the recent surge in building material prices post-COVID-19 has also compelled many builders and homeowners to opt for cheaper options, for example, pinewood over solid hardwoods. 


#4: Modern energy-efficiency schemes

Since 1998, Australia's energy-efficiency building regulation development has focussed on the need to reduce greenhouse gas emissions associated with electricity or gas consumption for the heating and cooling of building interiors. In 2004, regulatory changes to the National Construction Code (NCC) introduced the Nationwide House Energy Rating Scheme (NatHERS), implementing a star rating system to curb emissions from heating and cooling. Pre-2003, most homes were rated between 1.7-2.3 stars, but the changes mandated a minimum 4-star rating (subsequently increased to 5, 6, and then 7 stars). About 90% of new home designs now undergo assessment under this scheme, and the resulting homes are considerably more airtight.

As buildings have tightened up, excess condensation has become a growing issue, leading to mould growth even in many newly occupied homes. According to a study by Dewsbury & Law (2016), up to 50% of Australian houses constructed in the preceding decade were found to be affected by condensation issues, often by their first winter. In addition to trapping moisture, this 'tightening' of buildings also inadvertently slowed down the drying process for materials like drywall when they became wet, making mould growth a prevalent problem even after minor water intrusion events.


#5: Reduced ventilation standards

In 1981, during the energy crisis, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) reduced the required outside make-up air from 15 cfm (cubic feet per minute) to just 5 cfm per person. While HVAC technicians in Australia follow their own standards, ASHRAE standards are influential well beyond the United States. This change, noted in the ventilation standard ASHRAE 62-1981, disregarded earlier findings on the health risks of low ventilation rates. As a result, the 1980s witnessed the emergence of indoor air quality issues and the well-known "sick building syndrome", now recognised as CIRS-WDB.

Although ASHRAE withdrew its 62-1981 standard in 1983 due to rising health concerns, numerous buildings continued to be erected for several years, adhering to the unhealthy ventilation guidelines. Construction of poorly ventilated buildings persisted throughout the 1980s until ASHRAE formally reissued the standard in 1989. This delay in updating ventilation standards prolonged poor indoor air quality in many buildings, worsening health issues among occupants.

#6: Introduction of Air Conditioning 

Since the 1970s, air conditioning has become commonplace in Western societies. Today, many buildings, especially commercial ones, are tightly sealed and depend on mechanical ventilation rather than natural airflow. Unfortunately, air conditioners have a well-earned reputation for spreading contaminated air within buildings. Dust buildup on cooling coils and duct surfaces, combined with condensation, can turn air conditioning units into breeding grounds for mould. Mould spores from the system itself, or spores from different parts of a contaminated building, can then be circulated throughout the building, spreading contamination to a broader area.

Commercial buildings, where air conditioning systems are often less meticulously maintained, are especially prone to this issue. This can make it challenging for individuals sensitive to mould to visit some shops and shopping centres without risking exposure. Even car air conditioning systems are known to frequently produce symptoms in those sensitised to mould.

Air conditioners can also cause leaks due to equipment failures, with clogged condensate lines being a common problem. Furthermore, using air conditioners for summertime cooling can create unexpected condensation risks, altering the typical flow of water vapour throughout the year and affecting building structures.

#7: Wall-to-wall Carpet

Wall-to-wall carpeting refers to carpeting that covers an entire room from wall to wall, unlike smaller area rugs that cover specific sections of the floor. This type of carpeting gained popularity and became more affordable in the years following WWII, continuing to rise in popularity through the 1950s and 1960s.

Carpets act as a reservoir for dust and are notorious for trapping mould spores and other allergens and pollutants. Simply walking on carpeted floors can resuspend these particles into the air. The dust and lint that accumulates in carpets is hygroscopic, meaning it tends to absorb moisture in damp environments. When the relative humidity at the floor surface exceeds 60%, the dust can absorb enough moisture to support fungal growth.

In addition to humidity, carpets can trap moisture from spills, leaks, or high humidity, creating an ideal environment for mould growth. If the carpet doesn't dry completely after being exposed to moisture, mould can thrive and will often remain hidden from view on the underside. 

#8: Shift towards cheaper, less-durable waterproofing systems

From the 1980s onward, there was a significant shift in waterproofing methods: from durable sheet-based waterproofing membranes, which had a service life of approximately 15-25 years, to liquid-based coatings made from acrylic (with a service life of 6-8 years) or polyurethane (10-12 years). This transition led to a marked increase in leaks in bathrooms, laundries, and other wet areas.

In the last decade, changes in obtaining waterproofing qualifications have also impacted the quality of waterproofing works. Some can now earn certification through online training, and others get their qualifications through Recognition of Prior Learning, often based on information passed down from their employer, such as a builder who may not be an expert in waterproofing. Additionally, the increased use of natural, citrus-based cleaning products and bleaching agents may inadvertently degrade the structural integrity of waterproofing membranes, so these products should be avoided in areas with such membranes.

#9: Braided flexi hoses

Approximately 25 years ago in Australia, flexible braided hoses began replacing copper pipes for connecting water to taps and fixtures, commonly found under bathroom sinks and in laundries and kitchens. While they are cheaper and easier to install, saving on labour costs, these flexi hoses have a short service life of 5-8 years. They have become the leading cause of home flooding, accounting for 22% of all insurance claims in the country. Many are unaware that flexi hoses need to be regularly inspected for signs of degradation and be replaced within their designated service life.


#10: Construction moisture from fast construction methods

The increasing demand for rapid construction often results in insufficient curing time for concrete, leading to excess indoor humidity as uncontrolled evaporating moisture is released indoors. For instance, a 100 mm thick floor slab requires at least 4 months to dry, while a 150 mm thick slab may take between 6 and 9 months. Considering that 1m³ of concrete requires approximately 210 litres of water, and approximately 120 litres will remain after curing, this leaves 90 litres to evaporate into the environment, potentially contributing to indoor mould growth.
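The arithmetic above can be sketched as a quick back-of-the-envelope calculation. This is an illustrative sketch only: the per-cubic-metre water figures come from the paragraph above, while the slab area in the example is an assumed value, not a standard.

```python
# Rough illustration of how much construction moisture a new concrete
# slab can release indoors, using the approximate figures from the text.
# Approximate litres of water per m^3 of fresh concrete (from the text).
MIX_WATER_L_PER_M3 = 210
# Approximate litres chemically bound in the concrete after curing.
BOUND_WATER_L_PER_M3 = 120

def evaporable_water_litres(area_m2: float, thickness_mm: float) -> float:
    """Litres of free water left to evaporate from a slab of the given size."""
    volume_m3 = area_m2 * (thickness_mm / 1000)
    return volume_m3 * (MIX_WATER_L_PER_M3 - BOUND_WATER_L_PER_M3)

# Example: an assumed 100 m^2 house slab, 100 mm thick
litres = evaporable_water_litres(100, 100)
print(f"{litres:.0f} litres of moisture to evaporate indoors")  # 900 litres
```

Even a modest slab, in other words, can release hundreds of litres of water vapour into an enclosed, energy-efficient home over the months it takes to dry.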

High-rise construction techniques can exacerbate this issue. Although concrete takes months to cure fully, construction often proceeds as soon as the floors and columns are firm enough to support additional structures, enclosing the concrete prematurely. As a result, new houses and apartments made from concrete or concrete masonry can experience high internal moisture levels for up to two years post-construction. This can pose significant issues in energy-efficient buildings if not appropriately managed with dehumidification and ventilation strategies, and can allow mould to thrive. 

While concrete itself does not favour mould growth due to its alkalinity, building materials in contact with wet concrete can promote mould development. Installing hard surface flooring or textile carpets over insufficiently cured concrete can also result in chemical reactions that produce alcohols and other volatile organic compounds, adversely impacting health.

#11: More flat roofs and absence of eaves

Pitched roofs and eaves are highly effective at protecting against wind-driven rain due to their sloped design, which allows water to run off quickly. In contrast, flat roofs require precise design and consistent maintenance to prevent water damage. The primary issue with flat roofs is their minimal incline, which hinders natural drainage and can lead to water accumulation and puddling. Standing water can cause leaks and damage roofing materials over time without proper drainage.

Flat roofs often rely on a waterproof membrane to prevent water infiltration. However, exposure to the elements can cause these membranes to degrade, crack, or develop holes over time. Similarly, the sealants used to secure seams and joints can deteriorate, leading to leaks. Flat roofs are also more prone to debris accumulation than pitched roofs. Leaves, twigs, dirt, and other debris can collect on the roof surface, obstructing drainage pathways and often remaining unseen.


#12: Climate change

Climate change may contribute to escalating extreme weather phenomena, such as heavy rainfall and flooding. While climate change isn't the sole driver, it can impact mould proliferation by altering temperatures and precipitation patterns. These changes can elevate indoor and outdoor humidity levels, fostering a more hospitable environment for mould growth, especially in poorly ventilated buildings. In Sydney, for example, the heavy rainfall and sustained high humidity over the past few years have led to increased mould growth in homes that previously did not experience such issues. Intense weather events like heavy rainfall and flooding can also result in water seepage within buildings.


#13: Ageing and poorly maintained infrastructure

Poor maintenance practices can exacerbate the vulnerability of buildings to water damage. Amidst the rising cost of homes and living expenses, many struggle to afford ongoing maintenance. Landlords may also be reluctant to cover these costs or respond quickly to water damage - especially in competitive leasing markets where tenants have limited bargaining power - leading to increasingly poor rental conditions. There is also not yet a worldwide consensus on acceptable levels of mould exposure, complicating efforts to enforce standards and conduct inspections.

Additionally, the mould remediation industry remains largely unregulated, which can lead to inadequate mould removal and enable building owners to claim that issues have been resolved without proper oversight.

#14: Mutating Fungi

Evidence suggests that moulds are not only thriving more in modern times but also becoming more toxigenic. In the 1960s and 1970s, adding fungicides to paint became a common practice. While introduced to curb fungal growth, especially in areas prone to high humidity and moisture, this inadvertently triggered mutations in the small percentage of fungi that survived the treatment. Between 1940 and 1970, the plant protection industry also began using agricultural fungicides, which increased the mutagenicity of fungi, leading to fungal adaptations and resistance.

In the 1960s, the development of benzimidazoles (used as a fungicide) contributed to mutagenicity by targeting specific proteins in fungal pathogens, which created strong selective pressure for the development of resistant strains. Benzimidazoles killed susceptible fungi but allowed any fungi with mutations in the β-tubulin gene (that prevented benzimidazole binding) to survive. These surviving fungi had a selective advantage because they could reproduce in the presence of the fungicide. The emergence of benzimidazole resistance in fungi was noted as early as 1971, creating an environment where only fungi capable of rapid genetic adaptation could thrive, thereby increasing the overall mutagenicity and resilience of these moulds.

The extensive use of fungicides may have contributed to the evolution of common moulds and other microbial agents, rendering them more dangerous. This phenomenon is similar to the development of antibiotic-resistant bacteria in the medical field.

#15: Increased use of wallpaper

Before the early 1900s, wallpaper often contained arsenic for colour vibrancy and mercury for surface shine. While these heavy metals posed significant health risks, they inadvertently inhibited fungal growth. Once the health risks became known and these metals were phased out, modern wallpaper backings became more susceptible to mould. Wallpaper can trap moisture originating from high humidity, water leaks, or wall surface condensation between the wall and the wallpaper.

Many wallpapers are also made from or contain organic materials like cellulose, which provides a food source for mould spores. The glue or paste used to adhere wallpaper can also be organic, further feeding mould growth. If the conditions are right, mould can develop behind the wallpaper, hidden from view, making it difficult to detect and address until it has spread significantly. 


And there you have it! It's not just in our heads; there are many reasons why mould is a bigger issue in buildings today.
